C# Async/Await & Concurrency Training

Mastering Async Programming for Scalable Applications

πŸ“š Key Learning Objectives:

βœ“ Understanding the thread starvation problem

βœ“ Mastering async/await syntax and state machines

βœ“ Avoiding deadlocks and common pitfalls

βœ“ Best practices for scalable applications

Training Duration: ~90 minutes | Level: Junior to Mid-Level

πŸ‘¨β€πŸ« Instructor: Abubakr Bakhromov

πŸ“˜ Primary Reference: Microsoft Docs - Async Programming

The Thread Starvation Problem

Why Threads Are Precious Resources

When a thread handles an I/O request (HTTP, database, file), it blocks and waits. During this time, the thread cannot serve other requests.

⚠️ The Problem: A web server with 100 threads handling 100 blocking I/O operations = zero capacity for request #101.
// Blocking sync: thread waits doing nothing
var content = File.ReadAllText(path);            // Thread blocked ~100ms!

// Non-blocking async: thread released immediately
var content = await File.ReadAllTextAsync(path); // Thread freed!

How Async Solves This

Async allows a thread to initiate an I/O operation and immediately return to the pool. When I/O completes, any available thread continues the work.

🎯 Why This Matters: Under heavy load, sync servers hit thread starvation (all threads blocked). Async servers keep threads free to accept new requests.
πŸ’‘ Real Impact: With async, 2-3 threads can handle thousands of concurrent requests because threads never sit idle.

πŸ“˜ ASP.NET Core Performance Best Practices

Threads, ThreadPool & Context Switching

Thread: Expensive OS Resource

β€’ Each thread consumes ~1MB of stack memory

β€’ Requires kernel-mode transitions for scheduling

β€’ Creating thousands of threads = memory exhaustion + CPU thrashing

ThreadPool: Managed Solution

ASP.NET Core uses a ThreadPool with a dynamically-sized set of worker threads. Work items queue and execute on available threads.

// ❌ DON'T - Manual thread (unmanaged)
new Thread(() => DoWork()).Start();  // ~1MB overhead, no management

// βœ… DO - Use the ThreadPool via Task
await Task.Run(() => DoWork());      // Managed, reusable threads

Context Switching: The Hidden Cost

What happens: When CPU switches threads, it must save/restore registers, stack pointers, and memory mappings. More threads = more switching = higher overhead.

🎯 Why this matters: ThreadPool maintains optimal thread count for your CPU cores. Manual threads bypass this intelligence, leading to contention and poor performance.
πŸ’‘ Golden Rule: Never create threads manually. Let the ThreadPool manage thread lifecycle.

πŸ“˜ Managed Thread Pool

I/O-Bound vs. CPU-Bound: The Critical Distinction

Understanding the Difference

I/O-Bound: Thread waits for external resource β†’ Thread is idle

CPU-Bound: Thread performs computation β†’ Thread is busy

Type      | Examples                              | Right Approach & Why
I/O-Bound | HTTP calls, DB queries, File I/O      | async/await - frees the thread during the wait
CPU-Bound | Image processing, Encryption, Parsing | Task.Run or Parallel - offloads the work
// I/O-Bound: thread would be IDLE waiting for the DB
var users = await db.Users.ToListAsync();            // βœ… Correct

// CPU-Bound: thread is BUSY computing
var hash = ComputeSHA256(data);                      // βœ… Sync is fine

// ❌ WRONG: wrapping I/O in Task.Run wastes a pool thread
var users = await Task.Run(() => db.Users.ToList()); // Thread still waits!
🎯 Why Task.Run for I/O is wrong: You're using a pool thread to block on I/O. That thread could serve other requests. Always use native async I/O methods.

πŸ“˜ Async in Depth

Deep Dive: How Non-Blocking I/O Actually Works

The Low-Level Mechanism

Step 1: Application calls async I/O method (e.g., GetAsync)

Step 2: .NET passes request to OS kernel via I/O Completion Ports (IOCP)

Step 3: Thread is released back to pool (doesn't wait)

Step 4: OS kernel/device handles I/O asynchronously

Step 5: When complete, IOCP signals a completion packet

Step 6: Any available thread picks up completion and resumes

ℹ️ Key Insight: The thread doesn't poll or sleep. The OS-level IOCP efficiently manages thousands of concurrent I/O operations without dedicating a single thread to waiting.

Why This Architecture Scales

// 1000 concurrent HTTP requests
var tasks = urls.Select(url => httpClient.GetAsync(url)).ToArray();
await Task.WhenAll(tasks);

// Threads used: ~2-3 (for continuations)
// I/O operations: 1000 (handled by IOCP, not threads)
🎯 Why it's so efficient: IOCP is a kernel-level mechanism. It doesn't use threads to waitβ€”it uses hardware interrupts. When I/O completes, the device signals the CPU directly.

πŸ“˜ I/O Completion Ports (Windows)

The Four Execution Models

Model        | Description                | Behavior
Synchronous  | A β†’ B β†’ C (blocking)       | Each operation completes before the next starts
Asynchronous | A (pause) β†’ B (resume) β†’ C | Thread released during the wait, non-blocking
Concurrent   | Interleaving work          | Multiple tasks make progress (may be 1 thread)
Parallel     | Simultaneous work          | Multiple cores executing at the same instant

Code Examples

// Synchronous - takes 6 seconds total
var user1 = GetUser(1); // 2 seconds
var user2 = GetUser(2); // 2 seconds
var user3 = GetUser(3); // 2 seconds

// Asynchronous concurrent - takes ~2 seconds total
var task1 = GetUserAsync(1); // Start
var task2 = GetUserAsync(2); // Start
var task3 = GetUserAsync(3); // Start
await Task.WhenAll(task1, task2, task3); // All running concurrently

// Parallel - CPU-bound, uses multiple cores
Parallel.For(0, 1000, i => ProcessItem(items[i]));
🎯 Key Distinction: Async β‰  Parallel. Async can be concurrent on a single thread (interleaved continuations). Parallel requires multiple CPU cores executing simultaneously.

πŸ“˜ Asynchronous Programming Patterns

Demo: The Blocking Effect (Synchronous)

ASP.NET Core Endpoint - Blocking Version

[HttpGet("blocking")]
public IActionResult GetBlocking()
{
    Thread.Sleep(5000); // Simulates I/O: blocks the thread for 5 seconds
    return Ok("Hello");
}
⚠️ Capacity Analysis:
β€’ ThreadPool Size: 100 threads
β€’ Request Duration: 5 seconds (blocking)
β€’ Max Throughput: 100 / 5 = 20 requests/second
β€’ Request #101: Must wait in queue

Why This Doesn't Scale

With 1000 concurrent users, 900 requests are queued or timing out while all 100 threads sit blocked. Each blocked thread is a wasted resource that could serve other requests.

// Under load with 1000 concurrent requests:
// Requests 1-100:    being processed (threads blocked)
// Requests 101-1000: waiting or timing out
// Result: poor UX, server appears "slow"
🎯 Why this fails: Blocking threads on I/O is like parking cars in drive-through lanesβ€”the lane is occupied but no work is happening. You need more lanes (threads) for the same throughput.

πŸ“˜ ASP.NET Core Performance

Demo: The Async Fix (Non-Blocking)

ASP.NET Core Endpoint - Async Version

[HttpGet("async")]
public async Task<IActionResult> GetAsync()
{
    await Task.Delay(5000); // Simulates I/O: releases the thread
    return Ok("Hello");
}
βœ… Capacity Analysis (Async):
β€’ ThreadPool Size: 100 threads
β€’ Request Duration: 5 seconds (non-blocking)
β€’ Max Throughput: Thousands of requests/second
β€’ Request #1001: No problem! Threads handle it during I/O waits

Why This Scales

Threads are freed during await Task.Delay. Those same 100 threads juggle thousands of concurrent requests by handling them in segments (before await, after await).

// Under load with 1000 concurrent requests:
// Threads actively used: ~2-10 (for continuations)
// Requests in flight:    1000 (all progressing)
// Requests blocked:      0
// Result: all users get fast responses
🎯 Real-World Impact: A production team switched from sync to async and went from 200 req/sec to 5000 req/sec on the same hardware. That's 25x improvement!

πŸ“˜ ASP.NET Core Middleware

The Core Syntax: async & await

The Two Keywords Explained

async - Method modifier that enables await usage

await - Operator that suspends execution and returns control

public async Task<string> FetchDataAsync()
{
    // Line 1: executes synchronously
    Console.WriteLine("Starting...");

    // Line 2: PAUSE HERE - thread released
    var response = await httpClient.GetAsync(url);

    // Line 3: RESUME HERE when I/O completes (may be a different thread)
    Console.WriteLine("Received response");
    return await response.Content.ReadAsStringAsync();
}

What Actually Happens

1. Method starts executing synchronously until first await

2. At await, method pauses and returns a Task to caller

3. Thread is freed to do other work

4. When awaited operation completes, execution resumes after the await

🎯 Why "async all the way": If you mix sync and async (e.g., blocking on an async method with .Result instead of awaiting it), you lose the benefits: the calling thread still blocks waiting for the result.

πŸ“˜ await operator

Return Types & Method Signatures

Valid Return Types for Async Methods

Task - For methods that perform work but don't return a value

Task<T> - For methods that return a value of type T

void - ONLY for event handlers (explained later)

ValueTask<T> - Performance optimization (advanced)

// Returns Task (no value)
public async Task SendEmailAsync(string to, string subject)
{
    await smtpClient.SendAsync(to, subject);
    // No return statement needed
}

// Returns Task<string> (returns a string value)
public async Task<string> GetUserNameAsync(int id)
{
    var user = await _db.Users.FindAsync(id);
    return user.Name; // Returns string, but the signature is Task<string>
}

Why Task<T> Instead of Just T?

The Task represents the future result. When you call an async method, it returns immediately with a Task (a promise), not the actual result. The result arrives later when you await it.

⚠️ Common Confusion: you return string in the method body but declare Task<string> in the signature. That's correct - the compiler wraps your return value in a Task automatically.
🎯 Why Task wrapping: Tasks allow the caller to chain operations, handle errors, and await completion. They're the standardized way to represent "work in progress."

πŸ“˜ Async Return Types

Execution Flow: Step-by-Step Walkthrough

What Happens When You Await

public async Task<Result> WorkAsync()
{
    Console.WriteLine($"Start: Thread {Thread.CurrentThread.ManagedThreadId}");

    var data = await FetchAsync(); // PAUSE 1
    Console.WriteLine($"After fetch: Thread {Thread.CurrentThread.ManagedThreadId}");

    var result = await ProcessAsync(data); // PAUSE 2
    Console.WriteLine($"After process: Thread {Thread.CurrentThread.ManagedThreadId}");

    return result;
}

Trace the Execution

β€’ Line 1-2: Executes synchronously on caller's thread (say, thread 5)

β€’ Line 3 (await FetchAsync): Pauses, thread 5 released

β€’ Network I/O happens... (thread 5 handles other requests)

β€’ I/O completes: Continuation scheduled (may be thread 7)

β€’ Line 4-5: Resumes on thread 7, may print different thread ID

β€’ Line 6: Pauses again, thread 7 released

β€’ Resume: May continue on thread 3

🎯 Why different threads: The ThreadPool assigns any available thread for the continuation. The important part: threads are never blocking/waiting.
ℹ️ Key Insight: Thread IDs CAN differ across awaits! This is normal. The thread is doing useful work, not waiting.

πŸ“˜ Consuming TAP

The Compiler State Machine (Part 1)

High-Level Explanation

The compiler rewrites your async method into a state machine class that implements IAsyncStateMachine.

Why? The method must be able to pause, store its state, and resume later. State machines enable this behavior.

Each await = A State Transition

// Your code
var data = await FetchAsync();         // State 0
var result = await ProcessAsync(data); // State 1
return result;                         // State 2 (final)
πŸ’‘ Concept: The compiler transforms each await into a state. The state machine knows which state it's in and what to execute next.

The Compiler State Machine (Part 2)

Deeper Dive: How It Works

Local variables become fields: Your method's local variables are stored as fields in the state machine class. This allows them to persist across await points.

MoveNext() drives execution: The MoveNext() method is called to advance the state machine.

// Compiler-generated (heavily simplified concept)
class MyAsyncMethod_StateMachine : IAsyncStateMachine
{
    int state = 0;
    object data;          // Local variables become fields
    object result;
    TaskAwaiter awaiter;  // Remembers the pending operation

    public void MoveNext()
    {
        switch (state)
        {
            case 0:
                awaiter = FetchAsync().GetAwaiter();
                state = 1;
                awaiter.OnCompleted(MoveNext); // Resume here when I/O completes
                return;                        // Thread released
            case 1:
                data = awaiter.GetResult();
                // ... start ProcessAsync the same way, advance to state 2,
                // and finally complete the method's Task with `result` ...
                break;
        }
    }
}
ℹ️ Why This Matters: This persistence is why local variables maintain their values across awaits. They're actually fields, not stack variables.

Tasks & Task Objects

What a Task Represents

A Task is a promise of work that will complete in the future.

Key Properties

IsCompleted - True when the task has finished (successfully or not)

IsFaulted - True if the task threw an exception

Result - The result value (BLOCKS if not complete!)

⚠️ WARNING: Never access Task.Result or call Task.Wait(). Both block the thread and can cause deadlocks!
// ❌ WRONG - Blocks the thread
var result = GetDataAsync().Result;

// βœ… CORRECT - Non-blocking
var result = await GetDataAsync();

TaskCompletionSource: The Bridge

Purpose

Manually create a Task to wrap non-TAP APIs (events, old callbacks, legacy code).

How It Works

SetResult(value) - Marks the task as completed with a result

SetException(ex) - Marks the task as faulted

SetCanceled() - Marks the task as canceled

public Task<string> WrapLegacyEventAsync()
{
    var tcs = new TaskCompletionSource<string>();

    // Hook into the legacy events
    LegacyClass.DataReady += (data) => tcs.SetResult(data);
    LegacyClass.Error += (ex) => tcs.SetException(ex);

    LegacyClass.StartOperation();
    return tcs.Task; // Now awaitable!
}

// Usage
var result = await WrapLegacyEventAsync();

ValueTask: Allocation Optimization

The Problem ValueTask Solves

Every Task<T> is a heap-allocated object. For methods that complete synchronously (cached data), this allocation is wasteful.

ValueTask<T>: A Struct Alternative

ValueTask<T> is a struct (value type). If the result is immediately available, no heap allocation occurs.

private Dictionary<int, User> _cache = new();

public ValueTask<User> GetUserAsync(int id)
{
    // Fast path: cached result, no allocation
    if (_cache.TryGetValue(id, out var user))
        return new ValueTask<User>(user);

    // Slow path: async I/O
    return new ValueTask<User>(FetchFromDbAsync(id));
}
⚠️ Trade-off: Can only be awaited once. More complex than Task. Use only after profiling shows allocation is a bottleneck.

SynchronizationContext

What Is It?

A scheduler that determines where a continuation runs after an await.

Platform Differences

UI apps (WPF, WinForms): Have a UI SynchronizationContext (must resume on UI thread)

Classic ASP.NET: Has AspNetSynchronizationContext (per-request context)

ASP.NET Core: No SynchronizationContext (null)

Console apps: No SynchronizationContext (null)

ℹ️ Why It Matters: By default, await tries to resume on the original SynchronizationContext. This can cause deadlocks if that context is blocked.

Visual: Diagram showing a SyncContext boundary and how continuations resume on it.
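As a quick console sketch (not from the slides): in apps without a context, SynchronizationContext.Current is simply null, which is why continuations land on arbitrary pool threads.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class SyncContextDemo
{
    static async Task Main()
    {
        // Console apps (and ASP.NET Core) have no SynchronizationContext
        Console.WriteLine(SynchronizationContext.Current is null); // True

        await Task.Delay(10);

        // Nothing was captured, so the continuation ran on a pool thread
        Console.WriteLine(SynchronizationContext.Current is null); // True
    }
}
```

In WPF or WinForms the same check on the UI thread would print False, because those frameworks install a context that routes continuations back to the UI thread.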

ExecutionContext

What Is It?

Contextual data that automatically flows across async operations. Unlike SynchronizationContext (which is platform-specific), ExecutionContext flows in all .NET applications.

What Flows in ExecutionContext?

1. Thread.CurrentCulture and Thread.CurrentUICulture

Culture settings for formatting dates, numbers, and UI text. Ensures your async code respects user's locale settings even when resuming on different threads.

2. AsyncLocal<T> values

Thread-local storage that flows across async boundaries. Used for correlation IDs, request context, logging context, etc. Each async flow gets its own isolated copy.

3. Security principal/identity

The current user's identity (Thread.CurrentPrincipal). Critical for authorization checks in async code - the user context is preserved across awaits.

4. Other ambient data

Impersonation context, host execution context, and other framework-specific data that needs to flow with async operations.

ℹ️ Key Point: ExecutionContext is separate from SynchronizationContext. ExecutionContext flows automatically in all apps (console, web, UI); you rarely need to think about it.
πŸ’‘ Guarantee: These values are preserved across awaits, even when resuming on a different thread. The .NET runtime handles this automatically.
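The AsyncLocal<T> flow described above can be shown with a minimal console sketch (the name CorrelationId is illustrative):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class AsyncLocalDemo
{
    // AsyncLocal<T> rides on the ExecutionContext, so it follows the
    // logical async flow rather than any particular thread
    static readonly AsyncLocal<string> CorrelationId = new();

    static async Task Main()
    {
        CorrelationId.Value = "req-123";

        await Task.Delay(10); // continuation may resume on a different pool thread

        // Still "req-123": the ExecutionContext flowed across the await
        Console.WriteLine(CorrelationId.Value);
    }
}
```

This is the mechanism logging frameworks use to keep a request's correlation ID attached to every log line, no matter which thread runs the continuation.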

ConfigureAwait(false) Mechanics

What Is SynchronizationContext?

A scheduler that determines where a continuation runs after an await. By default, await tries to resume on the original SynchronizationContext.

Platform Differences:

β€’ UI apps (WPF, WinForms): Have UI SynchronizationContext (must resume on UI thread)

β€’ Classic ASP.NET: Has AspNetSynchronizationContext (per-request context)

β€’ ASP.NET Core: No SynchronizationContext (null)

β€’ Console apps: No SynchronizationContext (null)

How ConfigureAwait(false) Works

// Default behavior
var data = await GetDataAsync();
// Tries to resume on the original context

// With ConfigureAwait(false)
var data = await GetDataAsync().ConfigureAwait(false);
// Resumes on any available thread pool thread
🎯 Why this matters: In UI apps, if the UI thread is blocked waiting for a task, and that task tries to resume on the UI thread, you get DEADLOCK. ConfigureAwait(false) prevents this.

πŸ“˜ Task.ConfigureAwait Method

ConfigureAwait(false) Practical Rules

Rule 1: Library/Service Code

ALWAYS use ConfigureAwait(false) in library code and service layers to prevent deadlocks for callers.

// Library/service layer code
public async Task<Data> GetDataAsync()
{
    var response = await httpClient.GetAsync(url)
        .ConfigureAwait(false); // Don't care about the context
    return await response.Content.ReadAsAsync<Data>()
        .ConfigureAwait(false);
}
🎯 Why in libraries: You don't know your caller's context. If they have a SynchronizationContext and block on your method, you'd deadlock them. ConfigureAwait(false) makes your library safe.

Rule 2: Top-Level/UI Code

OMIT ConfigureAwait(false) (or use true) to ensure resumption on the UI thread/main context.

// UI event handler
private async void OnButtonClick(object sender, EventArgs e)
{
    var data = await GetDataAsync(); // NO ConfigureAwait(false)
    textBox.Text = data;             // Must run on the UI thread
}
🎯 Why omit in UI code: UI controls can ONLY be accessed from the UI thread. If you use ConfigureAwait(false), you'd resume on a background thread and crash when accessing textBox.

ASP.NET Core Exception

ASP.NET Core has no SynchronizationContext, so ConfigureAwait(false) is optional (but still good practice for consistency).

πŸ“˜ ConfigureAwait FAQ - Stephen Toub

The Deadlock Trap (Classic Scenario)

The Classic Deadlock

Classic ASP.NET or UI app: A method blocks on an async call using .Result or .Wait().

// ❌ DEADLOCK SCENARIO
public ActionResult MyAction() // Running on a UI/request context
{
    var user = GetUserAsync().Result; // BLOCKS, waiting for the task
    return View(user);                // Never reached!
}
⚠️ Deadlock Cycle:
1. Thread calls .Result on UI context
2. Thread blocks waiting for Task
3. Task completes, tries to resume on UI context
4. UI context can't run (blocked by thread)
5. DEADLOCK - mutual waiting
🎯 Why this deadlocks: UI apps have a single-threaded SynchronizationContext. The blocked thread holds the only way to run continuations. The continuation needs that thread to run. Neither can proceed.

Where This Happens

β€’ WPF/WinForms UI apps (single UI thread)

β€’ Classic ASP.NET (per-request context)

β€’ NOT ASP.NET Core (no SynchronizationContext)

πŸ“˜ Don't Block on Async Code - Stephen Cleary

The Deadlock Code (The Anti-Pattern)

The Dangerous Code

// ❌ DEADLOCK ANTI-PATTERN
public ActionResult MyAction()
{
    var user = GetUserAsync().Result; // BLOCKS
    return View(user);
}

private async Task<User> GetUserAsync()
{
    return await _db.Users.FirstOrDefaultAsync();
    // Tries to resume on the blocked context
}

Why It Deadlocks

β€’ Main thread blocks on .Result

β€’ SynchronizationContext is now blocked

β€’ GetUserAsync() completes and tries to resume

β€’ Can't resume because context is blocked

β€’ Main thread can't unblock because it's waiting

⚠️ This code will hang forever in UI apps and Classic ASP.NET

Fix 1: The Principle of Async All The Way

The Primary Solution

Refactor the calling method to be async and use await. No blocking, no deadlock.

// βœ… CORRECT - Async all the way
public async Task<ActionResult> MyActionAsync()
{
    var user = await GetUserAsync(); // No blocking!
    return View(user);
}

private async Task<User> GetUserAsync()
{
    return await _db.Users.FirstOrDefaultAsync();
}
βœ… Why This Works: No thread ever blocks. When GetUserAsync() completes, the continuation runs normally on the context.

Fix 2: The Library Fallback

Secondary Solution

Use ConfigureAwait(false) in library/lower layers. Only if you cannot control the caller.

// βœ… Library code with ConfigureAwait(false)
private async Task<User> GetUserAsync()
{
    var user = await _db.Users
        .FirstOrDefaultAsync()
        .ConfigureAwait(false); // Don't capture the context
    return user;
}

Why This Helps

The continuation doesn't try to resume on the blocked context. It runs on any thread pool thread.

⚠️ Important: Fix 1 (async all the way) is always better. Fix 2 just makes the code more tolerant of bad callers.

Deadlocks in ASP.NET Core?

Lower Deadlock Risk

ASP.NET Core has no SynchronizationContext. This eliminates the classic deadlock scenario.

                  | Classic ASP.NET   | ASP.NET Core
SyncContext       | Yes (per-request) | No (null)
.Result Deadlock? | Yes (high risk)   | No (low risk)
Still Bad?        | Yes               | Yes!
⚠️ Even in ASP.NET Core: Blocking with .Result still wastes ThreadPool threads. It's about throughput, not deadlocks.
πŸ’‘ Hint: Still avoid .Result/.Wait() even in ASP.NET Core.

Fire-and-Forget: The Silent Killer

The Anti-Pattern

// ❌ WRONG - Exception is LOST
public void OnButtonClick()
{
    _ = SendEmailAsync(); // Fire and forget
}
// If SendEmailAsync throws, nobody knows!

Why It's Dangerous

β€’ Exception thrown in the async method is never observed

β€’ App may silently fail

β€’ You can't tell when the work completes

β€’ Hard to debug

ℹ️ Better Approach: Make caller async and await properly.

Handling Background Work Properly

Solution 1: Make Caller Async

public async Task OnButtonClickAsync()
{
    await SendEmailAsync(); // Properly awaited
}

Solution 2: Use IHostedService (ASP.NET Core)

public void OnButtonClick()
{
    _backgroundQueue.QueueBackgroundWorkItem(async ct =>
    {
        try
        {
            await SendEmailAsync();
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "Email send failed");
        }
    });
}
πŸ’‘ Hint: If you MUST fire-and-forget, log exceptions inside the task body.

Async in Constructors (The Factory Pattern)

The Problem

Constructors cannot be async. A constructor must synchronously return a fully initialized object.

// ❌ ILLEGAL - Compiler error
public async MyService()
{
    Data = await FetchAsync();
}

Solution: Factory Method

public class MyService
{
    private MyService() { } // Private constructor

    public static async Task<MyService> CreateAsync()
    {
        var instance = new MyService();
        instance.Data = await FetchAsync();
        return instance;
    }
}

// Usage
var service = await MyService.CreateAsync();

Nested Async Chain Walkthrough

Real Application Architecture

Controller β†’ Service β†’ Repository β†’ Database

// Controller layer
public async Task<IActionResult> GetUser(int id)
{
    var user = await _service.GetUserAsync(id);
    return Ok(user);
}

// Service layer
public async Task<User> GetUserAsync(int id)
{
    return await _repo.GetUserByIdAsync(id);
}

// Repository layer
public async Task<User> GetUserByIdAsync(int id)
{
    return await _db.Users.FirstOrDefaultAsync(u => u.Id == id);
}
ℹ️ Key Points: No blocking, exceptions propagate up, threads never wait.

Exception Handling: Basics

try/catch Works Normally

Exceptions from awaited tasks are rethrown at the await, so an ordinary catch block around it works just like in synchronous code.

public async Task ProcessAsync()
{
    try
    {
        var data = await FetchDataAsync(); // If it throws, caught below
        await ProcessDataAsync(data);
    }
    catch (HttpRequestException ex)
    {
        _logger.LogError(ex, "HTTP request failed");
        throw; // Re-throw to propagate
    }
}
πŸ’‘ Pattern: Always await or explicitly handle exceptions inside the async method.

Exception Handling: Multiple Tasks

What Happens with Task.WhenAll?

If any task fails, the Task returned by Task.WhenAll is faulted with an AggregateException containing every inner exception. Awaiting that task, however, rethrows only the first one.

try
{
    var tasks = new[] { Task1Async(), Task2Async(), Task3Async() };
    await Task.WhenAll(tasks);
}
catch (Exception ex) // Only the FIRST exception is rethrown here
{
    _logger.LogError(ex, "At least one task failed");
}
ℹ️ Note: When you await Task.WhenAll, only the first exception is thrown. Access task.Exception to see all exceptions.
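A small runnable sketch (using Task.FromException to fabricate two failures) shows both behaviors: await rethrows a single exception, while the WhenAll task records them all.

```csharp
using System;
using System.Threading.Tasks;

class WhenAllExceptionsDemo
{
    static async Task Main()
    {
        // Keep a reference to the WhenAll task so we can inspect it later
        var whenAll = Task.WhenAll(
            Task.FromException(new InvalidOperationException("first")),
            Task.FromException(new TimeoutException("second")));

        try
        {
            await whenAll; // rethrows only the FIRST exception
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message); // "first"
        }

        // The task itself holds ALL failures in an AggregateException
        Console.WriteLine(whenAll.Exception!.InnerExceptions.Count); // 2
    }
}
```

Storing the WhenAll task in a variable before awaiting it is the key move: it gives you a handle to the full AggregateException after the await throws.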

CancellationToken: The Why

Rationale

Allows external code (user cancellation, server timeout) to stop long-running work early. Saves CPU/DB resources.

Use Cases

β€’ User closes browser tab or navigates away

β€’ Request timeout (HttpClient timeout)

β€’ Application shutdown (hosted service stopping)

β€’ Batch operation timeout

πŸ’‘ Hint: Use CancellationToken.ThrowIfCancellationRequested() in loops to check cancellation frequently.
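A minimal sketch of that hint, using CancelAfter to simulate a timeout (the Task.Delay in the loop stands in for a real unit of work):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class CancellationDemo
{
    static async Task Main()
    {
        using var cts = new CancellationTokenSource();
        cts.CancelAfter(TimeSpan.FromMilliseconds(50)); // simulate a timeout

        try
        {
            for (int i = 0; ; i++)
            {
                // Check cancellation on every iteration; throws
                // OperationCanceledException once the token fires
                cts.Token.ThrowIfCancellationRequested();
                await Task.Delay(10); // simulated unit of work
            }
        }
        catch (OperationCanceledException)
        {
            Console.WriteLine("Cancelled cleanly");
        }
    }
}
```

The same pattern applies inside services: check the token between iterations of any long loop so cancellation is observed promptly instead of only at the next I/O call.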

CancellationToken: Implementation

How to Pass the Token Down the Stack

// Controller
[HttpGet]
public async Task<IActionResult> GetUser(int id, CancellationToken ct)
{
    var user = await _service.GetUserAsync(id, ct);
    return Ok(user);
}

// Service
public async Task<User> GetUserAsync(int id, CancellationToken ct)
{
    var response = await httpClient.GetAsync(url, ct);
    return await response.Content.ReadAsAsync<User>(ct);
}
ℹ️ ASP.NET Core automatically provides a CancellationToken for requests that's triggered when client disconnects.

async void: The Event Handler Exception

The Rule

ONLY use async void for event handlers (UI framework requirement). MUST include error handling inside the method body.

// βœ… OK - Event handler (async void required)
private async void OnButtonClick(object sender, EventArgs e)
{
    try
    {
        await ProcessDataAsync();
    }
    catch (Exception ex)
    {
        _logger.LogError(ex, "Button click failed");
        MessageBox.Show($"Error: {ex.Message}");
    }
}
⚠️ Warning: In async void, exceptions are thrown on the SynchronizationContext and can crash the app if unhandled.
🎯 Why async void is dangerous: With Task-returning methods, exceptions are captured in the Task. With async void, there's no Task to capture exceptionsβ€”they're rethrown on the SynchronizationContext, potentially crashing your app.

Why Event Handlers Must Be Void

Event handler delegates are defined as void (not Task). The framework doesn't await event handlers, so they must be fire-and-forget by design. This is why async void exists.
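A tiny console sketch (with illustrative names) of why the delegate type forces async void - EventHandler itself is declared to return void, so there is no Task for the raiser to await:

```csharp
using System;
using System.Threading.Tasks;

class EventHandlerDemo
{
    // Uses the built-in void-returning EventHandler delegate
    public static event EventHandler? Clicked;

    static void Main()
    {
        // An async subscriber compiles only as `async void`,
        // because the delegate's return type is void
        Clicked += async (sender, e) =>
        {
            await Task.Delay(10);
            Console.WriteLine("handled");
        };

        Clicked?.Invoke(null, EventArgs.Empty); // raiser cannot await the handler
        Task.Delay(100).Wait(); // demo only: give the handler time to finish
    }
}
```

Note the demo has to sleep before exiting - exactly the fire-and-forget problem the slides warn about, which is why error handling must live inside the handler body.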

πŸ“˜ Async/Await Best Practices - Stephen Cleary

Synchronous Code Calling Async Code

A Necessary Evil Sometimes

Sometimes you're forced to call async from sync (e.g., a synchronous interface).

// ❌ RISKY - can deadlock if a SynchronizationContext is present
public void SyncMethod()
{
    var result = MyAsyncMethod().GetAwaiter().GetResult();
}

// βœ… SAFER - Task.Run moves the work to a pool thread with no
// context, so the continuation cannot deadlock the caller
public void SyncMethod()
{
    var result = Task.Run(() => MyAsyncMethod()).GetAwaiter().GetResult();
}
⚠️ Best Practice: Redesign to avoid this entirely. If unavoidable, understand the deadlock risks for your platform.

Task.WhenAll: Parallel I/O

How to Start Multiple I/O Operations Concurrently

// Fetch 3 users concurrently (parallel I/O)
var user1Task = GetUserAsync(1);
var user2Task = GetUserAsync(2);
var user3Task = GetUserAsync(3);

// Wait for ALL to complete
var users = await Task.WhenAll(user1Task, user2Task, user3Task);
// users is User[] with 3 results
βœ… Result: All 3 requests happen in parallel. Total time = slowest request, not sum of all requests.
🎯 Why use WhenAll: Starting all tasks before awaiting them allows them to run concurrently. If you await in a loop, they run sequentially. WhenAll = parallel I/O without blocking threads.

πŸ“˜ Task.WhenAll Method

Task.WhenAny: Race Conditions & Timeouts

Wait for the First Task to Complete

// Timeout pattern with Task.WhenAny
var delayTask = Task.Delay(TimeSpan.FromSeconds(5));
var requestTask = httpClient.GetAsync(url);

var first = await Task.WhenAny(requestTask, delayTask);
if (first == delayTask)
    throw new TimeoutException("Request timed out");

return await requestTask;
πŸ’‘ Use Cases: Timeouts, redundant API calls (first-one-wins), race conditions.
🎯 Why use WhenAny: Allows you to implement timeout patterns, try multiple sources and use whichever responds first, or implement cancellation with Task.Delay as a backup.

πŸ“˜ Task.WhenAny Method

Task.Run: The CPU-Bound Tool

When to Use It

Offload heavy, synchronous computation from the current thread to the thread pool.

// βœ… CORRECT - CPU-bound work
public async Task<byte[]> ProcessImageAsync(byte[] imageData)
{
    // Run the heavy computation on a thread pool thread
    var result = await Task.Run(() => ApplyFilters(imageData));
    return result;
}
🎯 Why only for CPU-bound: Task.Run queues work to the ThreadPool. For I/O operations, this wastes a thread that just sits idle waiting. I/O has its own async mechanisms (IOCP) that don't need threads.
⚠️ Rule: Use Task.Run only for CPU-bound work that would otherwise block the current thread.

When NOT to Use It

Alternative for UI responsiveness: In WPF/WinForms, use Task.Run to keep UI thread responsive during heavy computation. In ASP.NET Core, usually not needed.

πŸ“˜ Task.Run Method

Task.Run Anti-Patterns

Anti-Pattern 1: Wrapping I/O

// ❌ WRONG - Wastes a thread pool thread on I/O
var data = await Task.Run(() => httpClient.GetAsync(url));

// βœ… CORRECT - Use the native async API
var data = await httpClient.GetAsync(url);

Anti-Pattern 2: Wrapping .Result

// ❌ WRONG - Defeats the purpose
var user = await Task.Run(() => GetUserAsync(id).Result);

// βœ… CORRECT - Just await directly
var user = await GetUserAsync(id);

Parallelism APIs: Parallel.For & ForEach

When to Use: CPU-Bound Batch Processing

// Process 10,000 items in parallel
var items = GetItems(); // 10,000 items
var results = new ConcurrentBag<Result>();

Parallel.ForEach(items, item =>
{
    var result = ExpensiveComputation(item);
    results.Add(result); // Thread-safe collection
});
⚠️ Important: Parallel.ForEach blocks the calling thread. Not suitable for async I/O.
🎯 Why not for async I/O: Parallel.ForEach expects synchronous work. It blocks threads waiting for each iteration. For async I/O, use Task.WhenAll with Select instead - it starts all operations concurrently without blocking threads.

Alternative for Async I/O

// βœ… CORRECT for async I/O: use Task.WhenAll
var tasks = urls.Select(url => httpClient.GetAsync(url));
await Task.WhenAll(tasks); // Concurrent I/O, no blocking

// With a concurrency limit using SemaphoreSlim
var semaphore = new SemaphoreSlim(10); // at most 10 requests in flight
await Task.WhenAll(urls.Select(async url =>
{
    await semaphore.WaitAsync();
    try
    {
        return await httpClient.GetAsync(url);
    }
    finally
    {
        semaphore.Release();
    }
}));

πŸ“˜ Task Parallel Library (TPL)

PLINQ: Parallel LINQ

How to Use .AsParallel()

// Process data in parallel using LINQ
var results = items
    .AsParallel()
    .Where(item => item.IsValid)
    .Select(item => ExpensiveComputation(item))
    .ToList();
ℹ️ Benefit: Declarative, easy to use, automatically distributes work across cores.

Use For

β€’ CPU-intensive LINQ queries

β€’ Data transformation pipelines

β€’ Batch processing

Controlling Concurrency: SemaphoreSlim

Rationale

Limiting the maximum number of concurrent operations (e.g., API calls, DB connections).

The Problem

Firing off 1000 concurrent HTTP requests exhausts connection pools, memory, or hits rate limits.

πŸ’‘ Protects: Downstream resources from overload by limiting concurrency.

SemaphoreSlim.WaitAsync

The Correct, Non-Blocking Way

private static SemaphoreSlim _semaphore = new(3); // Max 3 concurrent

public async Task<HttpResponseMessage> FetchAsync(string url)
{
    await _semaphore.WaitAsync(); // Acquire a permit
    try
    {
        return await httpClient.GetAsync(url);
    }
    finally
    {
        _semaphore.Release(); // MUST release in finally
    }
}
⚠️ Important: Use WaitAsync() (async), NOT .Wait() (blocks).

IAsyncEnumerable: Streaming Data

The Problem with Task<List<T>>

Buffers ALL results in memory before returning. Wasteful for large datasets.

The Solution: IAsyncEnumerable<T>

// βœ… Streams results, no buffering
public async IAsyncEnumerable<User> GetAllUsersAsync()
{
    for (int i = 1; i <= 10000; i++)
    {
        var user = await FetchUserAsync(i);
        yield return user; // One at a time
    }
}
πŸ’‘ Benefit: Memory-efficient for large or streaming results.

Using IAsyncEnumerable

Consumer Side: await foreach

// Consume with await foreach
await foreach (var user in GetAllUsersAsync())
{
    Console.WriteLine(user.Name); // Processes items as they arrive
    await _db.SaveAsync(user);
}

ASP.NET Core Use

Return IAsyncEnumerable<T> from controllers to stream results directly to the client.

[HttpGet]
public async IAsyncEnumerable<User> StreamUsers()
{
    await foreach (var user in _repo.GetAllUsersAsync())
    {
        yield return user;
    }
}

ASP.NET Core vs. UI Context Summary

Recap: Context Behavior

               | Server (ASP.NET Core)       | UI (WPF/WinForms)
SyncContext    | null (none)                 | UI thread context
Deadlock Risk  | Low (no context)            | High (single thread)
ConfigureAwait | Optional (good practice)    | Essential in libraries
Async Benefits | Scalability (more requests) | Responsiveness (UI doesn't freeze)

Testing Async Code

Best Practices

Always test the Task (await the SUT call in the test method).

public class UserServiceTests
{
    [Fact]
    public async Task GetUser_ReturnsUser_WhenExists()
    {
        // Arrange
        var service = new UserService(_mockRepo.Object);

        // Act
        var user = await service.GetUserAsync(1);

        // Assert
        Assert.NotNull(user);
    }

    [Fact]
    public async Task GetUser_ThrowsException_WhenNotFound()
    {
        await Assert.ThrowsAsync<NotFoundException>(
            () => service.GetUserAsync(999));
    }
}

The Best Practices Checklist (Review)

7 Golden Rules

βœ… 1. Return Task/Task<T>: Never async void (except events)

βœ… 2. I/O vs CPU: Async for I/O, Parallel for CPU

βœ… 3. No Blocking: Never .Result or .Wait()

βœ… 4. ConfigureAwait(false): In library code

βœ… 5. CancellationToken: Pass through async chains

βœ… 6. Handle Exceptions: No fire-and-forget

βœ… 7. Control Concurrency: Use SemaphoreSlim when needed

Learning Resources & Next Steps

πŸ“˜ Essential Microsoft Documentation

β€’ Asynchronous Programming – C#

β€’ Task-based Asynchronous Pattern (TAP)

β€’ ASP.NET Core Performance Best Practices

β€’ EF Core Asynchronous Programming

πŸ“š Community Resources

β€’ Stephen Cleary's Blog - Async/Await expert

β€’ ConfigureAwait FAQ - Stephen Toub

πŸ“š Assignment: Convert a legacy sync component in your codebase to async. Apply the best practices learned today.

Summary & Q&A

Key Takeaway

Async is about SCALABILITY, not speed!

βœ… Use async/await correctly

βœ… Avoid blocking with .Result/.Wait()

βœ… Apply ConfigureAwait(false) in libraries

βœ… Pass CancellationToken through chains

Your applications will scale better!

Questions?

πŸ€– Made by AI β€’ Reviewed by Abubakr Bakhromov
