
Chrome 71 will require user gesture for AudioContext to work #40

@aszmyd

We've been using the AudioContext object to analyze a media stream. Chrome plans to change its audio policies in December 2018, and a user gesture will then be required for AudioContext to work:

https://developers.google.com/web/updates/2017/09/autoplay-policy-changes#webaudio

Key Point: If an AudioContext is created prior to the document receiving a user gesture, it will be created in the "suspended" state, and you will need to call resume() after a user gesture is received.
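In other words, the pattern the new policy seems to require looks roughly like this (a minimal sketch; the one-time click listener is just one example of a qualifying gesture):

```js
// An AudioContext created before any user gesture may start out
// "suspended"; resume() must then be called from a gesture handler.
const audioContext = new AudioContext();

if (audioContext.state === 'suspended') {
  // Any user gesture should qualify; a one-time click listener is one option.
  document.addEventListener('click', () => {
    audioContext.resume().then(() => {
      console.log('AudioContext state:', audioContext.state);
    });
  }, { once: true });
}
```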

The upshot: even if the user has granted mic access for a given website, audio analysis won't work when, for example, the page loads and starts capturing the mic without any user gesture having occurred.

Do you have an alternative way to detect the current mic level?
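For reference, the kind of setup we use looks roughly like this (a minimal sketch of getUserMedia feeding an AnalyserNode; `updateLevel` is just an illustrative name):

```js
// Sketch of the mic-level analysis described above:
// getUserMedia -> MediaStreamSource -> AnalyserNode.
navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  const audioContext = new AudioContext(); // may start suspended under the new policy
  const source = audioContext.createMediaStreamSource(stream);
  const analyser = audioContext.createAnalyser();
  analyser.fftSize = 256;
  source.connect(analyser);

  const data = new Uint8Array(analyser.frequencyBinCount);
  (function updateLevel() {
    analyser.getByteTimeDomainData(data);
    // Rough level: max deviation from the 128 midpoint, normalized to 0..1.
    let peak = 0;
    for (const v of data) peak = Math.max(peak, Math.abs(v - 128));
    console.log('mic level:', peak / 128);
    requestAnimationFrame(updateLevel);
  })();
});
```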
