
SwiftUI Speech Recognition iOS 26 Fix: Live Transcripts and Mic State Updates

12/7/2025

Debug diary of restoring live transcription on iOS 26 by letting @Published drive SwiftUI, keeping state on the main actor, and ignoring empty partials so the mic and transcript stay in sync.


We wanted a simple flow:

  • Tap the mic; it turns yellow while recording (blue when idle).
  • Text streams live and stays visible after stopping.
  • Works on device (iOS 26), with a safe legacy fallback.

Here’s how we diagnosed “no transcript + blue mic” and fixed it.

TL;DR fixes

  1. Let @Published drive SwiftUI (remove custom objectWillChange).
  2. Make the transcriber @MainActor so state changes reach the UI.
  3. Force a monochrome mic icon with explicit colors.
  4. Ignore empty partials so the transcript doesn’t get wiped.
  5. Keep the legacy SFSpeechRecognizer path on while we harden the iOS 26 path.

The bug: UI never updated

We had declared our own objectWillChange and were publishing it manually, which kept @Published changes from reaching SwiftUI. Result: the mic stayed blue and the transcript never appeared, even though logs showed partials.

Before (problematic)

final class SpeechTranscriber: NSObject, ObservableObject {
    let objectWillChange = ObservableObjectPublisher() // blocked @Published
    @Published var transcript: String = ""
    @Published var isRecording: Bool = false
}

After (let SwiftUI do its job)

@MainActor
final class SpeechTranscriber: NSObject, ObservableObject {
    @Published var transcript: String = ""
    @Published var isRecording: Bool = false
    @Published var status: String = "Ready"
}
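
For context, the view just observes this object; a minimal sketch (TranscribeView matches the name in our logs below, body abbreviated):

import SwiftUI

struct TranscribeView: View {
    @StateObject private var transcriber = SpeechTranscriber()

    var body: some View {
        VStack {
            Text(transcriber.status)
            Text(transcriber.transcript)
        }
    }
}

With the custom publisher gone, every change to transcript, isRecording, or status re-renders the view automatically.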

Making the mic actually yellow

Tab/item accents (a parent tint, for example) can override default symbol coloring. We forced monochrome rendering plus an explicit color.

Button(action: transcriber.toggleTranscription) {
    Image(systemName: transcriber.isRecording ? "stop.circle.fill" : "mic.circle.fill")
        .font(.system(size: 64))
        .symbolRenderingMode(.monochrome)   // ignore inherited accent tints
        .foregroundStyle(transcriber.isRecording ? Color.yellow : Color.blue) // foregroundColor is deprecated on iOS 17+
}

Showing the transcript (and keeping it)

SFSpeechRecognizer delivers partials. We:

  • Ignored empty updates so we don’t clear the UI.
  • Left the last good transcript in place after stopping.

recognitionTask = recognizer.recognitionTask(with: request) { [weak self] result, error in
    guard let self else { return }
    if let result {
        let text = result.bestTranscription.formattedString
        if text.isEmpty { return }              // don’t blank the UI
        DispatchQueue.main.async {
            self.transcript = text              // stays visible
            if self.isRecording { self.status = "Listening…" }   // skip republishing an unchanged status
        }
    }
    if let error { /* handle */ }
    else if result?.isFinal == true { self.stopRecording() }
}
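
For completeness: partials only arrive if the request asks for them. The setup ahead of this callback looks roughly like this (a sketch; we assume request and audioEngine are created in startRecording):

import Speech
import AVFoundation

let request = SFSpeechAudioBufferRecognitionRequest()
request.shouldReportPartialResults = true       // required for live partials

let inputNode = audioEngine.inputNode
let format = inputNode.outputFormat(forBus: 0)
inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    request.append(buffer)                      // stream mic audio into the request
}
audioEngine.prepare()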

Keeping state on the main thread

We set isRecording on the main actor right after the audio engine starts/stops so the button color flips immediately.

try audioEngine.start()
await MainActor.run {
    self.isRecording = true
    self.status = "Listening…"
}

// stop
cleanupRecognition()
DispatchQueue.main.async {
    self.isRecording = false
    if self.status.hasPrefix("Listening") { self.status = "Stopped" }
}
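
One thing the start path also needs on device: an active audio session before audioEngine.start(). A minimal sketch (your category and options may differ):

let session = AVAudioSession.sharedInstance()
try session.setCategory(.record, mode: .measurement, options: .duckOthers)
try session.setActive(true, options: .notifyOthersOnDeactivation)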

Logging to see the state flow

We added logs on:

  • Permission flow (speech + mic); sketched after the UI hooks below.
  • Start/stop, which path is used (legacy vs modern).
  • Partial/final transcripts.
  • View-level onChange for isRecording and transcript so we know the UI receives updates.

Example UI log hooks (using the two-parameter onChange, since the one-parameter form is deprecated on iOS 17+):

.onChange(of: transcriber.isRecording) { _, isRecording in
    print("TranscribeView: isRecording=\(isRecording)")
}
.onChange(of: transcriber.transcript) { _, text in
    print("TranscribeView: transcriptVisible=\(!text.isEmpty), preview=\"\(text.prefix(64))\"")
}
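
And the permission logging from the first bullet, sketched (the labels are ours):

import Speech
import AVFAudio

SFSpeechRecognizer.requestAuthorization { status in
    print("SpeechTranscriber: speech permission rawValue=\(status.rawValue)")
}
// AVAudioApplication replaces AVAudioSession.requestRecordPermission on iOS 17+.
AVAudioApplication.requestRecordPermission { granted in
    print("SpeechTranscriber: mic permission granted=\(granted)")
}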

Current behavior (legacy path active)

  • Tap mic: requests permissions, starts AVAudioEngine + SFSpeechRecognizer.
  • Mic turns yellow while recording, blue after stop.
  • Transcript streams live and stays visible.
  • Status: “Listening…” while active, “Stopped” on stop.

What’s next (optional)

  • Re-enable the iOS 26 on-device path once we finish format negotiation and locale/model handling.
  • Record audio to a file alongside transcription for debugging/sharing.
  • Add an “Open Settings” CTA when permissions are denied (sketched below).
  • Handle interruptions/route changes (calls, AirPods, background).
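
The Settings CTA is small enough to sketch now (hypothetical; not wired in yet):

import SwiftUI
import UIKit

// Hypothetical CTA shown when speech or mic permission is denied.
Button("Open Settings") {
    if let url = URL(string: UIApplication.openSettingsURLString) {
        UIApplication.shared.open(url)
    }
}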
