articles/ai-services/computer-vision/concept-face-liveness-detection.md
@@ -73,7 +73,7 @@ The liveness detection API returns a JSON object with the following information:
- A quality filtered "session-image" that can be used to store for auditing purposes or for human review or to perform further analysis using the Face service APIs.

-###Data privacy
+## Data privacy

We do not store any images or videos from the Face Liveness Check. No image/video data is stored in the liveness service after the liveness session has been concluded. Moreover, the image/video uploaded during the liveness check is only used to perform the liveness classification to determine if the user is real or a spoof (and optionally to perform a match against a reference image in the liveness-with-verify-scenario), and it cannot be viewed by any human and will not be used for any AI model improvements.
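The session outcome described in this hunk (the liveness decision plus the quality-filtered session image reference) is retrieved by the app server after the client flow finishes. Below is a minimal sketch of that retrieval; the API version, path, and response field names are assumptions for illustration only, so verify them against the current Face liveness API reference.

```python
# Hypothetical sketch: an app server fetching a completed liveness session's result.
# The path/version and response shape are assumptions, not the documented contract.
import os
import requests

FACE_ENDPOINT = os.environ["FACE_ENDPOINT"]  # e.g. https://<resource>.cognitiveservices.azure.com
FACE_KEY = os.environ["FACE_KEY"]

def get_liveness_result(session_id: str) -> dict:
    """Fetch the session result after the frontend has finished the liveness flow."""
    # Assumed version/path -- check the Face API reference you target.
    url = f"{FACE_ENDPOINT}/face/v1.1-preview.1/detectLiveness/singleModal/sessions/{session_id}"
    resp = requests.get(url, headers={"Ocp-Apim-Subscription-Key": FACE_KEY}, timeout=10)
    resp.raise_for_status()
    return resp.json()

# Fields such as the liveness decision and the session-image reference are part of the
# returned JSON object the concept article describes; exact names may differ by version.
result = get_liveness_result("<session-id-from-your-app-server>")
print(result)
```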
In this tutorial, you learn how to detect liveness in faces, using a combination of server-side code and a client-side mobile application. For general information about face liveness detection, see the [conceptual guide](../concept-face-liveness-detection.md).

-This tutorial demonstrates how to operate a frontend application and an app server to perform liveness detection, including the optional step of [face verification](#perform-liveness-detection-with-face-verification), across various language SDKs.
+This tutorial demonstrates how to operate a frontend application and an app server to perform [liveness detection](#perform-liveness-detection), including the optional step of [face verification](#perform-liveness-detection-with-face-verification), across various language SDKs.

> After you complete the prerequisites, you can get started faster by building and running a complete frontend sample (either on iOS, Android, or Web) from the [SDK samples folder](https://github.com/Azure-Samples/azure-ai-vision-sdk/tree/main/samples).
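For context on the frontend/app-server split this tutorial line refers to: the app server creates the liveness session with the Face service and hands only the short-lived authorization token to the frontend, which then runs the camera flow with the client SDK. The sketch below shows that server-side step under stated assumptions; the path, payload keys, and response fields are illustrative placeholders, not the documented contract.

```python
# Hypothetical sketch of the app-server side of liveness orchestration:
# create a session, keep the key and sessionId server-side, return only the token.
import os
import uuid
import requests

FACE_ENDPOINT = os.environ["FACE_ENDPOINT"]
FACE_KEY = os.environ["FACE_KEY"]

def create_liveness_session() -> dict:
    url = f"{FACE_ENDPOINT}/face/v1.1-preview.1/detectLiveness/singleModal/sessions"  # assumed version/path
    payload = {
        "livenessOperationMode": "Passive",        # illustrative option name
        "deviceCorrelationId": str(uuid.uuid4()),  # lets the server correlate attempts per device
    }
    resp = requests.post(
        url, json=payload, headers={"Ocp-Apim-Subscription-Key": FACE_KEY}, timeout=10
    )
    resp.raise_for_status()
    body = resp.json()
    # Only the short-lived authToken goes to the frontend application.
    return {"sessionId": body["sessionId"], "authToken": body["authToken"]}
```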
@@ -35,7 +35,7 @@ This tutorial demonstrates how to operate a frontend application and an app serv
## Prepare SDKs

-We provide SDKs in different languages to simplify development on frontend applications and app servers.
+We provide SDKs in different languages to simplify development on frontend applications and app servers:

### Download SDK for frontend application
@@ -396,12 +396,11 @@ The high-level steps involved in liveness orchestration are illustrated below:
## Perform liveness detection with face verification
:::image type="content" source="../media/liveness/liveness-verify-diagram.jpg" alt-text="Diagram of the liveness-with-face-verification workflow of Azure AI Face." lightbox="../media/liveness/liveness-verify-diagram.jpg":::