Bug 202405

Summary: Regression: iOS 13.1 MediaStreamTrack.enabled = false kills audio track
Product: WebKit
Reporter: alan.ford
Component: WebRTC
Assignee: youenn fablet <youennf>
Status: RESOLVED FIXED
Severity: Normal
CC: ben.browitt, betimer, cibernaio, commit-queue, daginge, eric.carlson, ews-watchlist, fippo, firstcontact, glenn, hta, jer.noble, kp, makarand, milen.yordanov, mmalavalli, msach22, philipj, sergio, szymon.witamborski, tommyw, webkit-bug-importer, youennf
Priority: P2
Keywords: InRadar
Version: Other   
Hardware: Unspecified   
OS: Unspecified   

alan.ford
Reported 2019-10-01 07:58:08 PDT
Created attachment 379902 [details]
Screenshot

To mute an audio track in WebRTC, you can set "enabled" on the MediaStreamTrack to false. This worked fine up until iOS 13.1 (and maybe also not in iOS 13), but now if you set "enabled" to false, then "readyState" goes from "live" to "ended" and no audio flows in either direction from then on.

This does not fail on video tracks.

You can test this with the Safari console on https://webrtc.github.io/samples/src/content/peerconnection/pc1/ - see screenshot.
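The muting pattern being described can be sketched as a couple of small helpers (the function names are hypothetical, written here for illustration). On an affected iOS 13.1 device, setting `enabled` back to true does not restore audio, and the second helper would start returning true because `readyState` has flipped to "ended":

```javascript
// Toggle the enabled flag on every audio track of a stream.
// On affected iOS 13.1 devices this is the operation that
// unexpectedly ends the capture track.
function setAudioEnabled(stream, enabled) {
  for (const track of stream.getAudioTracks()) {
    track.enabled = enabled;
  }
}

// Detect a track that was killed by the bug: it reports
// readyState "ended" even though it was never stopped.
function audioLooksDead(stream) {
  return stream.getAudioTracks().some((t) => t.readyState === "ended");
}
```

Both helpers only touch the standard MediaStreamTrack surface (`enabled`, `readyState`, `kind`), so they work the same against a real stream or a mock.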
Attachments
Screenshot (203.92 KB, image/png)
2019-10-01 07:58 PDT, alan.ford
no flags
Patch (3.23 KB, patch)
2019-10-03 07:02 PDT, youenn fablet
no flags
Patch for landing (3.23 KB, patch)
2019-10-03 08:24 PDT, youenn fablet
no flags
Patch (7.44 KB, patch)
2019-10-03 09:20 PDT, youenn fablet
no flags
Dag-Inge Aas
Comment 1 2019-10-02 00:00:11 PDT
Can confirm this on my device as well, we've had several reports from users about this. Currently on 13.1, will update to 13.1.2 and report back.
Dag-Inge Aas
Comment 2 2019-10-02 01:08:11 PDT
This bug is not fixed on iOS 13.1.2. I can reproduce the error 100% of the time with the following:

1. Go to https://webrtc.github.io/samples/src/content/peerconnection/pc1/ and initiate a call.
2. Observe that you get sound both ways by listening for feedback.
3. Open the developer console.
4. Enter pc1.getSenders().forEach(s => {s.track.kind === "audio" ? s.track.enabled = false : null})
5. Observe that sound now disappears. Unmuting the track by reversing step 4 does not bring sound back.

Furthermore, we are able to reproduce this with remote audio using Whereby (previously appear.in). Steps to reproduce:

1. Go to https://whereby.com/daginge1231231234 (p2p) on iOS 13.
2. Join using another browser; it doesn't matter which one, I've tested Chrome and iOS Safari.
3. Observe that you are getting audio running both ways (usually feedback).
4. Mute audio on iOS 13. Observe that audio now disappears both locally and from the remote party.
5. Observe that unmuting audio does not fix the problem.

The only way to work around this issue is to hard refresh the page. This is quite serious, and we've had to disable muting local audio on iOS 13 as a result.
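The console one-liner from the steps above can be written as a small helper operating on any RTCPeerConnection (a sketch; `setSendersAudioEnabled` is a hypothetical name, and `pc1` is the peer connection exposed by the pc1 sample page):

```javascript
// Mute (or unmute) every outgoing audio track on a peer connection.
// Equivalent to:
//   pc1.getSenders().forEach(s =>
//     s.track.kind === "audio" ? s.track.enabled = false : null)
// but guards against senders whose track is null.
function setSendersAudioEnabled(pc, enabled) {
  for (const sender of pc.getSenders()) {
    if (sender.track && sender.track.kind === "audio") {
      sender.track.enabled = enabled;
    }
  }
}
```

On an affected device, calling this with `false` and then `true` a few seconds later does not bring the audio back, which is the regression this bug tracks.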
youenn fablet
Comment 3 2019-10-02 01:46:48 PDT
I confirm this issue reproduces on a recent iOS 13, not older ones. It does not seem to reproduce on macOS.
Dag-Inge Aas
Comment 4 2019-10-02 02:09:20 PDT
Workaround as suggested by fippo/philipp: Use replaceTrack(null) to remove the audio track from the PC instead of muting it. Remember to not add the audio track on new peer connections, though I guess this is less of an issue for you SFUers. We'll implement this change now and test it out a bit.
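The suggested workaround can be sketched as follows (a hypothetical helper, not the exact code used; it assumes the original capture track is kept somewhere so it can be re-attached on unmute):

```javascript
// Workaround sketch: instead of track.enabled = false, detach the
// audio track from its sender entirely, and re-attach it on unmute.
// `savedAudioTrack` is wherever the app keeps the capture track.
async function setMuted(pc, savedAudioTrack, muted) {
  // After replaceTrack(null) the sender's track is null, so match
  // either an audio track or an already-detached sender.
  const sender = pc.getSenders().find(
    (s) => (s.track && s.track.kind === "audio") || s.track === null
  );
  if (!sender) return;
  await sender.replaceTrack(muted ? null : savedAudioTrack);
}
```

As the next comment warns, this should probably be gated to iOS 13 only rather than replacing all muting code.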
Dag-Inge Aas
Comment 5 2019-10-02 04:54:59 PDT
Word of warning to others who might try the replaceTrack workaround. I'm seeing some serious delays after unmuting in certain cases on Chromium-based WebRTC engines (Safari included). Firefox seems unaffected. So don't go changing all of your muting code to replaceTrack; it might cause some other painful issues, so best to keep this to iOS 13 only. FWIW, here is my workaround code:

const audioTrack = this.mediaStream.stream.getAudioTracks()[0];
const audioSender = this.pc
  .getSenders()
  .find(
    s => (s.track && s.track.kind === "audio") || s.track === null
  );
audioSender.replaceTrack(action.isMuted ? null : audioTrack);

And remember to stop doing track.enabled wherever you might do that in your code.
Philipp Hancke
Comment 6 2019-10-02 09:48:55 PDT
(yay, my old email still works :-) The classic e2e test for this is to disable a track, ensure the remote end gets silence, reenable, and ensure the remote end hears something again. WPT doesn't have anything for this, which seems pretty lacking. The closest is https://github.com/web-platform-tests/wpt/blob/master/mediacapture-streams/MediaStreamTrack-MediaElement-disabled-audio-is-silence.https.html which shows the analyser but lacks the "reenable" step and has no peerconnection in between.
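A minimal building block for the e2e check described above is deciding whether a chunk of analyser samples is effectively silence (a sketch; the function name and the RMS threshold are assumptions chosen for illustration):

```javascript
// Decide whether a buffer of time-domain samples (e.g. filled by
// AnalyserNode.getFloatTimeDomainData on the receiving end) is
// effectively silence, using a root-mean-square energy check.
// The threshold is an arbitrary illustrative value.
function isSilence(samples, threshold = 1e-4) {
  let sumSquares = 0;
  for (const s of samples) {
    sumSquares += s * s;
  }
  const rms = Math.sqrt(sumSquares / samples.length);
  return rms < threshold;
}
```

The test Philipp sketches would then assert `isSilence(...)` is true after disabling the track and false again after reenabling it.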
youenn fablet
Comment 7 2019-10-02 10:21:33 PDT
There is https://github.com/WebKit/webkit/blob/master/LayoutTests/webrtc/peer-connection-audio-mute2.html which is expected to cover this. I am not sure why this did not catch it. Probably the bug is in real capture code while the tests use mock capture.
Radar WebKit Bug Importer
Comment 8 2019-10-02 13:42:45 PDT
Xin
Comment 9 2019-10-02 19:58:06 PDT
Hi daginge and youenn, I have tried this on an iPhone 11 (iOS 13.1.2), and repeatedly muting and unmuting works properly. So I am not sure if this one is related to specific models. Xin
Dag-Inge Aas
Comment 10 2019-10-02 23:46:01 PDT
Hi Xin! It works if you do it fast, but if it mutes for anything longer than 2 seconds it will kill all sound. We have reports from iPhone 11 (Pro and regular), X, XS and iPhone 8 with the same issue.
Dag-Inge Aas
Comment 11 2019-10-03 00:05:34 PDT
FWIW we saw some reports from the facebook group for the Association for the Vision Impaired in Norway (botched that translation) that they experienced a similar issue with all sound disappearing after hanging up a call. This was OS level, and it was quite serious for them as they may no longer hear the phone when it rings, or that the sound from the other party may no longer work. So it might be that this issue is on a lower level than Safari.
Szymon Witamborski
Comment 12 2019-10-03 02:27:54 PDT
I've built a very minimal codepen to replicate this issue. It looks like this is not related to RTCPeerConnection at all; all it takes is an audio-only MediaStream connected to an <audio> element. https://codepen.io/brainshave/pen/oNNvgZP

Steps:
- Click "Start" and accept microphone access
- Uncheck "Audio Enabled"
- Wait a couple of seconds
- You should see "local audio track ended"

Works every time provided:
- screen recording is off
- the MediaStream is set as srcObject on the audio element

Tested with a Bluetooth headset and there was no difference. I also had a version of it with an RTCPeerConnection, but that proved unnecessary. It does seem like something on the system level, because turning on screen recording prevents it from happening. Perhaps some kind of power-saving feature? Tested on iPhone XR and iPhone 8, both on iOS 13.1.2.

Our app doesn't have any workarounds in yet for this issue, but we do re-request audio access if the track emits the "ended" event, so in our case the user will see the prompt for accessing the microphone again, which is not ideal, but at least it allows them to continue with their call.
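The re-request recovery described at the end of the comment can be sketched like this (hypothetical names; `requestAudio` stands in for `() => navigator.mediaDevices.getUserMedia({ audio: true })`, injected here so the sketch is self-contained):

```javascript
// Recovery sketch: when a capture track ends unexpectedly (as this
// bug causes), re-request the microphone and hand the fresh track
// to the caller, who can then swap it into senders/elements.
// Note this re-triggers the browser's permission prompt.
function recoverOnEnded(track, requestAudio, onNewTrack) {
  track.addEventListener("ended", async () => {
    const stream = await requestAudio();
    onNewTrack(stream.getAudioTracks()[0]);
  });
}
```

This does not fix the underlying bug; it only lets the user continue the call after re-granting microphone access.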
youenn fablet
Comment 13 2019-10-03 07:02:02 PDT
Created attachment 380109 [details]
Patch
Dag-Inge Aas
Comment 14 2019-10-03 07:22:02 PDT
Re: patch. I guess this explains why remote audio would disappear. Autoplay restrictions are lifted while the page is capturing, but this bug causes capturing to stop, reinstating the autoplay policies. And since the video element's playback was not started by a user gesture, the audio output stops. I love it.
Eric Carlson
Comment 15 2019-10-03 07:39:50 PDT
Comment on attachment 380109 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=380109&action=review

r=me once the bots are happy

> Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp:229
> bool MediaStreamPrivate::hasCaptureAudioSource() const

Nit: we should probably rename this to something like "hasActiveAudioSource" now that it considers ended and muted states.

> Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp:232
> -    if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack())
> +    if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack() && !track->ended() && !track->muted())

Should hasCaptureVideoSource also consider the ended and muted states?
youenn fablet
Comment 16 2019-10-03 07:43:33 PDT
> > Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp:232
> > -    if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack())
> > +    if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack() && !track->ended() && !track->muted())
>
> Should hasCaptureVideoSource also consider the ended and muted states?

Agreed. We should also probably make MediaStreamTrack a PlatformMediaSessionClient; that would catch some cases of tracks with no MediaStream. I tried to keep the patch as small as possible and plan to do a follow-up for these changes.
youenn fablet
Comment 17 2019-10-03 08:24:54 PDT
Created attachment 380117 [details] Patch for landing
youenn fablet
Comment 18 2019-10-03 08:54:01 PDT
Actually, I can probably write a test for it.
youenn fablet
Comment 19 2019-10-03 09:20:42 PDT
Created attachment 380121 [details]
Patch
WebKit Commit Bot
Comment 20 2019-10-03 11:09:39 PDT
Comment on attachment 380121 [details]
Patch

Clearing flags on attachment: 380121

Committed r250663: <https://trac.webkit.org/changeset/250663>
WebKit Commit Bot
Comment 21 2019-10-03 11:09:42 PDT
All reviewed patches have been landed. Closing bug.
Eric Carlson
Comment 22 2019-11-08 09:43:32 PST
*** Bug 203382 has been marked as a duplicate of this bug. ***
Manik
Comment 23 2019-11-20 09:36:41 PST
Hi Youenn, when can this fix be expected to be released? Apologies if I'm not reading the status correctly, but I see it as RESOLVED FIXED, meaning it's still waiting for a QA cycle to be verified. Is that correct?
Manjesh Malavalli
Comment 24 2020-01-06 17:49:41 PST
Hi Youenn, I think this bug is still present in iOS 13.3. I ran the JSFiddle in https://bugs.webkit.org/show_bug.cgi?id=203382 and was able to reproduce the observed behavior. A comment in that bug said that the fix was released in 13.2.2. Can you please clarify? - Manjesh
cibernaio
Comment 25 2020-01-28 07:10:17 PST
Hi,
Any news about this fix? Thanks. Bye.

(In reply to alan.ford from comment #0)
> Created attachment 379902 [details]
> Screenshot
>
> To mute an audio track in WebRTC, you can set "enabled" on the
> MediaStreamTrack to false. This worked fine up until iOS 13.1 (and maybe
> also not in iOS 13), but now if you set "enabled" to false, then
> "readyState" goes from "live" to "ended" and no audio flows in either
> direction from then on.
>
> This does not fail on video tracks.
>
> You can test this with the Safari console on
> https://webrtc.github.io/samples/src/content/peerconnection/pc1/ - see
> screenshot.
Keyur Patel
Comment 26 2020-03-30 12:47:49 PDT
Any update on the status of this bug? Looks like this is still reproducible on 13.2 and 13.3?
Jay Charles
Comment 27 2020-04-07 10:22:40 PDT
Can confirm this is still an issue on Safari / iOS 13.4. As previous comments suggest, the problem only surfaces if the remote MediaStreamTrack is attached to a video or audio element. If I attach only the video track to the video element and send the audio track to an AudioContext, all works as expected.
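The workaround described above can be sketched as one helper (a sketch under the assumption that the caller supplies the video element and an AudioContext; the function name is hypothetical, the APIs used are the standard MediaStream and Web Audio ones):

```javascript
// Workaround sketch: attach only the video track to the <video>
// element, and route the audio track through Web Audio instead of
// letting the media element render it.
function attachRemoteStream(stream, videoElement, audioContext) {
  // Video-only stream for the media element.
  videoElement.srcObject = new MediaStream(stream.getVideoTracks());
  // Audio goes MediaStream -> MediaStreamAudioSourceNode -> speakers.
  const source = audioContext.createMediaStreamSource(
    new MediaStream(stream.getAudioTracks())
  );
  source.connect(audioContext.destination);
}
```

Note that an AudioContext generally needs a user gesture to start, so the app may still have to call `audioContext.resume()` from a click handler.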
youenn fablet
Comment 28 2020-04-07 12:17:59 PDT
(In reply to Jay Charles from comment #27)
> Can confirm this is still an issue on Safari / iOS 13.4.
>
> As previous comments suggest, the problem only surfaces if the remote
> MediaStreamTrack is attached to a video or audio element. If I attach only
> the video track to the video element and send the audio track to an
> AudioContext, all works as expected.

Hi Jay,
This might not be the same bug, as this one should be fixed in iOS 13.4. Since you can get data from AudioContext, the issue might be either in the audio renderer or in the autoplay policies.