Ever wondered how to build interactive timelines that feel natural and responsive? The kind where users can drag events and see real-time updates? SwiftUI’s combination of GeometryReader, alignmentGuide, and gesture handling makes this surprisingly elegant. Let’s build one from scratch and understand the magic behind positioning elements on a timeline.

The Challenge: Positioning Events in Time

Imagine you need to visualize events that occur over time: a video timeline, project milestones, or async operations. The core challenge is: how do you position an element at a specific time on a timeline of unknown width?

// We want this to work regardless of screen size
Event(time: 2.5, duration: 10.0) // Should be at 25% from left
Event(time: 7.0, duration: 10.0) // Should be at 70% from left

Let’s see how SwiftUI’s layout system can help us solve this elegantly.
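Before diving in, it helps to pin down the data. The article never shows the Event type, so the snippets below assume a model roughly like this sketch. The names here (label, the UUID id) are illustrative; the real project uses richer id and value types, as the AsyncAlgorithms section later hints.

```swift
import SwiftUI

// A minimal Event model assumed by the snippets in this article; the
// actual project's type is richer (enum-based ids and values), so
// treat these names as placeholders.
struct Event: Identifiable {
    var id = UUID()
    var time: TimeInterval          // seconds from the timeline's start
    var color: Color = .blue
    var label: String = ""

    // The view rendered for this event; a simple text stand-in here
    var value: some View {
        Text(label).font(.caption2)
    }
}
```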

Building the Basic Timeline

Let’s start with a simple timeline structure:

struct TimelineView: View {
    var events: [Event]
    var duration: TimeInterval

    var body: some View {
        GeometryReader { proxy in
            ZStack(alignment: .leading) {
                ForEach(events) { event in
                    event.value.frame(width: 30, height: 30)
                        .background {
                            Circle().fill(event.color)
                        }
                        .alignmentGuide(.leading) { _ in
                            let relativeTime = event.time / duration
                            return -(proxy.size.width-30) * CGFloat(relativeTime)
                        }
                }
            }
        }
        .frame(height: 50)
    }
}

The magic happens in that alignmentGuide closure. But wait — what’s with that mysterious formula? Let’s break it down.

-(proxy.size.width-30) * CGFloat(relativeTime)

This looks backwards at first glance. Why negative values? Let’s understand it step by step. The value you return from .alignmentGuide(.leading) tells SwiftUI where the view’s leading guide sits within the view; the ZStack then lines that guide up with its own leading edge:

  • Positive values → shift the view left (past the leading edge)
  • Negative values → shift the view right (towards the trailing edge)

We subtract the circle’s width (30 points) so the last event doesn’t overflow the timeline. Without this, a circle positioned at relativeTime = 1.0 would have its leading edge at the trailing edge, leaving the entire circle outside the visible area.
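To convince ourselves the formula behaves, here it is extracted into a plain function, using the 300-point width and 30-point node size that appear in this article’s later example:

```swift
// A quick sanity check of the alignment-guide formula, assuming a
// 300-point timeline, a 10-second duration, and 30-point nodes
// (the example values used later in this article).
func leadingGuide(time: Double, duration: Double,
                  width: Double, nodeWidth: Double = 30) -> Double {
    let relativeTime = time / duration
    return -(width - nodeWidth) * relativeTime
}

let start = leadingGuide(time: 0, duration: 10, width: 300)   // 0: flush with the leading edge
let middle = leadingGuide(time: 5, duration: 10, width: 300)  // -135: shifted halfway right
let end = leadingGuide(time: 10, duration: 10, width: 300)    // -270: node ends exactly at 300
```

Note how the node at time 10 starts at point 270 and ends exactly at 300, flush with the trailing edge.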

Adding Visual Enhancements

A bare timeline isn’t very helpful. Let’s add tick marks and a baseline:

struct TimelineView: View {
    var events: [Event]
    var duration: TimeInterval

    var body: some View {
        GeometryReader { proxy in
            ZStack(alignment: .leading) {
                // Baseline
                Rectangle()
                    .fill(Color.secondary)
                    .frame(height: 1)
                
                // Tick marks
                ForEach(0..<Int(duration.rounded(.up)), id: \.self) { tick in
                    Rectangle()
                        .frame(width: 1)
                        .foregroundStyle(.secondary)
                        .allowsHitTesting(false) // Important!
                        .alignmentGuide(.leading) { _ in
                            let relativeTime = CGFloat(tick) / duration
                            return -(proxy.size.width-30) * CGFloat(relativeTime)
                        }
                }
                .offset(x: 15) // Center ticks under circles
                
                // Events
                ForEach(events) { event in
                    event.value.frame(width: 30, height: 30)
                        .background {
                            Circle().fill(event.color)
                        }
                        .alignmentGuide(.leading) { _ in
                            let relativeTime = event.time / duration
                            return -(proxy.size.width-30) * CGFloat(relativeTime)
                        }
                }
            }
        }
        .frame(height: 50)
    }
}

Notice the .allowsHitTesting(false) on the tick marks — this ensures they don’t interfere with user interactions on the events.

Making Events Draggable

Now comes the fun part: making events draggable! We’ll create a separate EventNode view to handle the drag logic:

struct EventNode: View {
    @Binding var event: Event
    var secondsPerPoint: CGFloat
    @GestureState private var offset: CGFloat = 0
    
    var body: some View {
        event.value.frame(width: 30, height: 30)
            .background {
                Circle().fill(event.color)
            }
            .offset(x: offset)
            .gesture(gesture)
    }
    
    private var gesture: some Gesture {
        DragGesture()
            .updating($offset) { value, state, _ in
                state = value.translation.width
            }
            .onEnded { value in
                // Convert the drag distance (in points) to time
                event.time += value.translation.width * secondsPerPoint
                event.time = max(0, event.time) // Don't drag before the start
            }
    }
}

Instead of passing complex timeline information to each event node, we calculate a simple conversion factor:

// In TimelineView
let secondsPerPoint = duration / (proxy.size.width - 30)

This tells us: “How many seconds does one point represent?”

Example:

  • 10-second timeline on a 300-point width
  • secondsPerPoint = 10 / 270 ≈ 0.037 seconds per point
  • Drag 50 points → 50 × 0.037 ≈ 1.85 seconds later
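The same arithmetic in code, using the example numbers above:

```swift
// The conversion factor from the example: 10-second timeline,
// 300-point width, 30-point nodes.
let duration = 10.0
let usableWidth = 300.0 - 30.0                   // 270 points of travel
let secondsPerPoint = duration / usableWidth     // ≈ 0.037 s per point

let dragDistance = 50.0                          // points dragged right
let timeShift = dragDistance * secondsPerPoint   // ≈ 1.85 seconds
```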

Integrating with the Timeline

Now we update our TimelineView to use draggable nodes:

struct TimelineView: View {
    @Binding var events: [Event] // Note: now @Binding
    var duration: TimeInterval

    var body: some View {
        GeometryReader { proxy in
            let secondsPerPoint = duration / (proxy.size.width - 30)
            
            ZStack(alignment: .leading) {
                // ... baseline and ticks (same as before) ...
                
                ForEach($events) { $event in
                    EventNode(
                        event: $event,
                        secondsPerPoint: secondsPerPoint
                    )
                    .alignmentGuide(.leading) { _ in
                        let relativeTime = event.time / duration
                        return -(proxy.size.width-30) * CGFloat(relativeTime)
                    }
                }
            }
        }
        .frame(height: 50)
    }
}

The key change: we now use @Binding var events and ForEach($events) to allow modifications.
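For completeness, here’s how a parent view might own that state (a sketch; ContentView and the initial values are illustrative):

```swift
struct ContentView: View {
    @State private var events: [Event] = [
        Event(time: 2.5),
        Event(time: 7.0),
    ]

    var body: some View {
        // The binding lets dragged events write their new times back here
        TimelineView(events: $events, duration: 10)
            .padding()
    }
}
```

One caveat: a custom type named TimelineView shadows SwiftUI’s built-in TimelineView (available since iOS 15), so a real project might prefer a more distinctive name.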

From Static Events to Live Async Streams

Now here’s where things get really interesting. Our timeline isn’t just static data: it can visualize live asynchronous operations in real time!

The key insight: convert static events into async streams, then apply operations from Swift’s AsyncAlgorithms package to see how they combine.

extension Array where Element == Event {
    func makeStream(speedFactor: CGFloat = 1.0) async -> AsyncStream<Event> {
        AsyncStream { continuation in
            Task {
                let startTime = Date()
                
                for event in self {
                    // Sleep until this event's (scaled) time arrives.
                    // Sleeping for event.time directly would compound the
                    // delays across the loop, so we sleep only for the
                    // remaining interval.
                    let target = event.time / speedFactor
                    let elapsed = Date().timeIntervalSince(startTime)
                    if target > elapsed {
                        try? await Task.sleep(for: .seconds(target - elapsed))
                    }
                    
                    // Emit the event with updated timestamp
                    let actualTime = Date().timeIntervalSince(startTime) * speedFactor
                    var updatedEvent = event
                    updatedEvent.time = actualTime
                    continuation.yield(updatedEvent)
                }
                
                continuation.finish()
            }
        }
    }
}

This transforms our static timeline data into live streams that emit events at the right times. The speedFactor lets us speed up slow operations for demo purposes.
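Consuming one of these streams looks like this (a sketch; it assumes the Event initializer shown earlier and the makeStream extension above):

```swift
let inputs = [Event(time: 1.0), Event(time: 3.0)]

Task {
    // At 10x speed, these arrive roughly 0.1s and 0.3s after start
    for await event in await inputs.makeStream(speedFactor: 10) {
        print("received event at t ≈ \(event.time)")
    }
}
```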

Visualizing AsyncAlgorithms Operations

Now we can apply Swift’s AsyncAlgorithms and watch the results appear in real time:

import AsyncAlgorithms

func run(algorithm: Algorithm, _ events1: [Event], _ events2: [Event]) async -> [Event] {
    let speedFactor: CGFloat = 10 // 10x speed for demos
    
    let stream1 = await events1.makeStream(speedFactor: speedFactor)
    let stream2 = await events2.makeStream(speedFactor: speedFactor)
    
    var result = [Event]()
    let startDate = Date()
    
    // Track elapsed time for output events
    var interval: TimeInterval { 
        Date().timeIntervalSince(startDate) * speedFactor 
    }
    
    switch algorithm {
    case .merge:
        // Merge streams: output events as soon as they arrive
        // (collected with a loop; the standard library has no
        // Array initializer that takes an AsyncSequence)
        for await event in merge(stream1, stream2) {
            result.append(event)
        }
        return result
        
    case .zip:
        // Zip streams: wait for pairs, then output combined events
        for await (e1, e2) in zip(stream1, stream2) {
            result.append(
                Event(
                    id: .pair(e1.id, e2.id),
                    time: interval,
                    color: .blue,
                    value: .pair(e1.value, e2.value)
                )
            )
        }
        return result
        
    case .combineLatest:
        // Combine latest: output whenever either stream emits
        for await (e1, e2) in combineLatest(stream1, stream2) {
            result.append(
                Event(
                    id: .pair(e1.id, e2.id),
                    time: interval,
                    color: .purple,
                    value: .pair(e1.value, e2.value)
                )
            )
        }
        return result
    }
}

The beautiful part: users can drag events in the input timelines and watch how doing so affects the algorithm’s output in real time.

Conclusion

Building interactive timelines in SwiftUI involves several key concepts:

  1. GeometryReader for responsive sizing
  2. alignmentGuide for precise positioning
  3. Drag gestures with @GestureState for smooth interactions
  4. Conversion factors for translating pixels to meaningful units
  5. AsyncStream bridges for visualizing live async operations

The result is a timeline that feels natural to use and adapts to any screen size. The positioning formula might look mysterious at first, but once you understand SwiftUI’s coordinate system, it becomes an elegant solution to a complex layout problem. What will you build with your new timeline powers?


Try It Yourself

Want to experiment with the complete implementation? The full source code is available on GitHub, including all the algorithms, timeline components, and interactive features we’ve discussed.

📦 Get the Complete Project Repository: Interactive AsyncAlgorithms Timeline

This article was developed in collaboration with the excellent team at objc.io as part of their async algorithms timeline visualization project.