<pre class="metadata">
Title: Generic Sensor API
Shortname: generic-sensor
Level: none
Status: ED
Group: dap
ED: https://w3c.github.io/sensors/
TR: https://www.w3.org/TR/generic-sensor/
Editor: Rick Waldron 50572, Invited Expert, formerly on behalf of Bocoup and JS Foundation
Former Editor: Mikhail Pozdnyakov 78325, Intel Corporation, https://intel.com/
Former Editor: Alexander Shalamov 78335, Intel Corporation, https://intel.com/
Former Editor: Tobie Langel 60809, Codespeaks, formerly on behalf of Intel Corporation, https://www.codespeaks.com/, [email protected]
Abstract:
This specification defines a framework for exposing sensor data
to the Open Web Platform in a consistent way.
It does so by defining a blueprint for writing
specifications of concrete sensors along with an abstract Sensor interface
that can be extended to accommodate different sensor types.
!Other: <a href="https://github.com/web-platform-tests/wpt/tree/master/generic-sensor">Test suite</a>, <a href="https://github.com/w3c/sensors/commits/main/index.bs">latest version history</a>, <a href="https://github.com/w3c/sensors/commits/gh-pages/index.bs">previous version history</a>
Indent: 2
Repository: w3c/sensors
Markup Shorthands: markdown on
Include MDN Panels: if possible
Implementation Report: https://www.w3.org/wiki/DAS/Implementations
Inline GitHub Issues: yes
Default Biblio Status: current
Status Text:
Further implementation experience is being gathered for the
[=permission request algorithm=] and specification clarifications
informed by this experience are being discussed in
<a href="https://github.com/w3c/sensors/issues/397">GitHub issue #397</a>.
The group does not expect to advance this specification beyond CR until
[[PERMISSIONS-REQUEST]] moves beyond incubation.
  This document is maintained and updated at all times. Some parts of this document are still work in progress.
</pre>
<pre class="anchors">
urlPrefix: https://html.spec.whatwg.org/multipage/; spec: HTML
type: dfn
urlPrefix: webappapis.html
text: task queue
text: spin the event loop; url: spin-the-event-loop
urlPrefix: interaction.html
text: DOM anchor; url: dom-anchor
text: gains focus; url: gains-focus
text: currently focused area; url: currently-focused-area-of-a-top-level-traversable
urlPrefix: https://w3ctag.github.io/security-questionnaire/; spec: SECURITY-PRIVACY-QUESTIONNAIRE
type: dfn
text: same-origin policy violations; url: sop-violations
urlPrefix: https://w3c.github.io/webdriver/; spec: WEBDRIVER2
type: dfn
text: current browsing context; url: dfn-current-browsing-context
text: WebDriver error code; url: dfn-error-code
text: local end; url: dfn-local-ends
text: url variable; url: dfn-url-variables
text: get a property; url: dfn-getting-properties
text: get a property with default; url: dfn-getting-the-property-with-default
text: set a property; url: dfn-set-a-property
urlPrefix: https://tc39.github.io/ecma262/; spec: ECMAScript
type: dfn
text: current realm; url: current-realm
</pre>
<pre class=link-defaults>
spec: dom; type:dfn; text:event
spec: webidl; type:dfn; text:attribute
spec: webidl; type:dfn; text:dictionary member
spec: webidl; type:dfn; text:identifier
spec: webdriver2; type:dfn; text:error
spec: webdriver2; type:dfn; text:session
</pre>
<pre class=biblio>
{
"ACCELPRINT": {
"authors": [
"Dey, Sanorita, et al."
],
"id": "ACCELPRINT",
"href": "http://synrg.csl.illinois.edu/papers/AccelPrint_NDSS14.pdf",
"title": "AccelPrint: Imperfections of Accelerometers Make Smartphones Trackable",
"date": "2014",
"status": "Informational",
"publisher": "Network and Distributed System Security Symposium (NDSS)"
},
"MOBILESENSORS": {
"authors": [
"Manish J. Gajjar"
],
"id": "MOBILESENSORS",
"title": "Mobile Sensors and Context-Aware Computing",
"date": "2017",
"status": "Informational",
"publisher": "Morgan Kaufmann"
},
"GYROSPEECHRECOGNITION": {
"authors": [
"Michalevsky, Y., Boneh, D. and Nakibly, G."
],
"id": "GYROSPEECHRECOGNITION",
"href": "https://www.usenix.org/system/files/conference/usenixsecurity14/sec14-paper-michalevsky.pdf",
"title": "Gyrophone: Recognizing Speech from Gyroscope Signals",
"date": "2014",
"status": "Informational",
"publisher": "USENIX"
},
"STEALINGPINSVIASENSORS": {
"authors": [
"Maryam Mehrnezhad, Ehsan Toreini, Siamak F. Shahandashti, Feng Hao"
],
"id": "STEALINGPINSVIASENSORS",
"href": "https://rd.springer.com/article/10.1007/s10207-017-0369-x?wt_mc=Internal.Event.1.SEM.ArticleAuthorOnlineFirst",
"title": "Stealing PINs via mobile sensors: actual risk versus user perception",
"date": "2017",
"status": "Informational",
"publisher": "International Journal of Information Security"
},
"GENERIC-SENSOR-USECASES": {
"authors": [
"Rick Waldron, Mikhail Pozdnyakov, Alexander Shalamov"
],
"id": "GENERIC-SENSOR-USECASES",
"href": "https://w3c.github.io/sensors/usecases",
"title": "Sensor Use Cases",
"date": "2017",
"status": "Note"
},
"COORDINATES-TRANSFORMATION": {
"authors": [
"George W. Collins, II"
],
"id": "COORDINATES-TRANSFORMATION",
"href": "http://ads.harvard.edu/books/1989fcm..book/Chapter2.pdf",
"title": "The Foundations of Celestial Mechanics",
"date": "2004",
"status": "Informational",
"publisher": "Pachart Foundation dba Pachart Publishing House"
}
}
</pre>
<h2 id="intro">Introduction</h2>
Increasingly, sensor data is used in application development to
enable new use cases such as geolocation,
counting steps or head-tracking.
This is especially true on mobile devices where new sensors are added regularly.
Exposing sensor data to the Web
has so far been both slow-paced and ad-hoc.
Only a few sensors are currently exposed to the Web.
When they are, it is often in ways that limit their possible use cases
(for example by exposing abstractions that are too [=high-level=]
and which don't perform well enough).
APIs also vary greatly from one sensor to the next
which increases the cognitive burden of Web application developers
and slows development.
The goal of the Generic Sensor API is to
promote consistency across sensor APIs,
enable advanced use cases thanks to performant [=low-level=] APIs,
and increase the pace at which new sensors can be exposed to the Web
by simplifying the specification and implementation processes.
A comprehensive list of concrete sensors based on the Generic Sensor API, applicable use
cases, and code examples can be found in the [[GENERIC-SENSOR-USECASES]] and [[MOTION-SENSORS]]
explainer documents.
<h2 id="scope">Scope</h2>
<em>This section is non-normative</em>.
The scope of this specification is currently limited
to specifying primitives which enable exposing data from [=device sensors=].
Exposing remote sensors
or sensors found on personal area networks (e.g. Bluetooth)
is out of scope.
As work in these areas matures,
it is possible that common, lower-level primitives will be found,
in which case this specification will be updated accordingly.
This should have little to no effect on implementations, however.
This specification also does not currently expose a
sensor discovery API.
This is because the limited number of sensors currently available to user agents
does not warrant such an API.
Using feature detection, such as described in [[#feature-detection]],
is good enough for now.
A subsequent version of this specification might specify such an API,
and the current API has been designed with this in mind.
<h2 id="feature-detection">A note on Feature Detection of Hardware Features</h2>
<em>This section is non-normative.</em>
Feature detection is an established Web development best practice.
Resources on the topic are plentiful on and offline and
the purpose of this section is not to discuss it further,
but rather to put it in the context of detecting hardware-dependent features.
Consider the below feature detection examples:
<div class="example">
This simple example illustrates how to check whether the User Agent
exposes an interface for a particular sensor type. To handle errors in a robust
manner, please refer to [this example](#robust-example).
<pre highlight="js">
if (typeof Gyroscope === "function") {
// run in circles...
}
if ("ProximitySensor" in window) {
// watch out!
}
if (window.AmbientLightSensor) {
// go dark...
}
// etc.
</pre>
</div>
All of these tell you something about the presence
and possible characteristics of an API.
They do not tell you anything, however, about whether
that API is actually connected to a real hardware sensor,
whether that sensor works,
whether it is still connected,
or even whether the user is going to allow you to access it.
Note that you can check the latter using the Permissions API [[PERMISSIONS]].
In an ideal world, information about the underlying status
would be available upfront.
The problem with this is twofold.
First, getting this information out of the hardware is costly,
in both performance and battery time,
and would sit in the critical path.
Secondly, the status of the underlying hardware can evolve over time.
The user can revoke permission, the connection to the sensor be severed,
the operating system may decide to limit sensor usage below a certain battery threshold,
etc.
Therefore, an effective strategy is to combine feature detection,
which checks whether an API for the sought-after sensor actually exists,
and defensive programming which includes:
1. checking for errors thrown when instantiating a {{Sensor}} object,
2. listening to errors emitted by it,
3. handling all of the above gracefully so that the user's experience is
enhanced by the possible use of a sensor, not degraded by its
absence.
<div class="example" id="robust-example">
The following snippet illustrates how an Accelerometer sensor can be created
in a robust manner. A Web application may choose different options for error
handling, for example showing a notification, choosing a different sensor type, or
falling back to another API.
<pre highlight="js">
let accelerometer = null;
try {
accelerometer = new Accelerometer({ frequency: 10 });
accelerometer.addEventListener('error', event => {
// Handle runtime errors.
if (event.error.name === 'NotAllowedError') {
console.log('Permission to access sensor was denied.');
} else if (event.error.name === 'NotReadableError' ) {
console.log('Cannot connect to the sensor.');
}
});
accelerometer.addEventListener('reading', () => reloadOnShake(accelerometer));
accelerometer.start();
} catch (error) {
// Handle construction errors.
if (error.name === 'SecurityError') {
console.log('Sensor construction was blocked by the Permissions Policy.');
} else if (error.name === 'ReferenceError') {
console.log('Sensor is not supported by the User Agent.');
} else {
throw error;
}
}
</pre>
</div>
<h2 id="security-and-privacy">Security and privacy considerations</h2>
<div class="note">
The judgement on how to communicate to the user the known [[#main-privacy-security-threats|threats]]
is up to the implementer. The implementation of the [[#mitigation-strategies|mitigations]] is
mandatory, however.
</div>
[=sensor readings|Sensor readings=] are sensitive data and could become the subject of
various attacks by malicious Web pages. Before discussing the mitigation strategies, we
briefly enumerate the main types of privacy and security threats posed by [=device sensor|sensors=].
[[MOBILESENSORS]] categorizes the main threats into [[#location-tracking|location tracking]],
[[#eavesdropping|eavesdropping]], [[#keystroke-monitoring|keystroke monitoring]],
[[#device-fingerprinting|device fingerprinting]], and [[#user-identifying|user identification]].
This categorization is a good fit for this specification.
The risk of a successful attack can increase when [=device sensor|sensors=] are used with each other,
in combination with other functionality, or when used over time,
specifically with the risk of correlation of data
and user identification through fingerprinting.
Web application developers using these JavaScript APIs should
consider how this information might be correlated with other information
and the privacy risks that might be created.
The potential risks of collection of such data over a longer period of time
should also be considered.
Variations in [=sensor readings=]
as well as event firing rates
offer the possibility of fingerprinting to identify users.
User agents may reduce the risk by
limiting event rates available to web application developers.
Minimizing the accuracy of a sensor's readout
generally decreases the risk of fingerprinting.
User agents should not provide unnecessarily verbose readouts of sensor data.
Each [=sensor type=] should be assessed individually.
If the same JavaScript code using the API can be
used simultaneously in different window contexts on the same device,
it may be possible for that code to correlate the user across those two contexts,
creating unanticipated tracking mechanisms.
User agents should consider providing the user
an indication of when the [=device sensor|sensor=] is used
and allowing the user to disable it.
Additionally, user agents may consider
allowing the user to verify past and current sensor use patterns.
Web application developers that use [=device sensor|sensors=] should
perform a privacy impact assessment of their application
taking all aspects of their application into consideration.
The ability to detect the full working set of sensors on a device can form an
identifier and could be used for fingerprinting.
A combination of selected sensors can potentially be used to form an out-of-band
communication channel between devices.
Sensors can potentially be used in cross-device linking and tracking of a user.
<h3 id="main-privacy-security-threats">Types of privacy and security threats</h3>
<em>This section is non-normative.</em>
<h4 id="location-tracking">Location Tracking</h4>
Under this type of threat, the attacks use [=sensor readings=] to locate the device without
using GPS or any other location sensors. For example, accelerometer data can be used to infer
the location of smartphones by using statistical models to obtain an estimated trajectory;
map matching algorithms can then be used to obtain predicted location points (within a
200 m radius) [[MOBILESENSORS]].
<h4 id="eavesdropping">Eavesdropping</h4>
Recovering speech from gyroscope [=sensor readings|readings=] is an example of an eavesdropping attack.
See [[GYROSPEECHRECOGNITION]].
<h4 id="keystroke-monitoring">Keystroke Monitoring</h4>
Many user inputs can be inferred from [=sensor readings=]; this includes a wide range of attacks
on user PINs, passwords, and lock patterns (and even touch actions such as click, scroll, and
zoom) using motion sensors. These attacks normally train a machine learning algorithm to
discover such information about users. See [[STEALINGPINSVIASENSORS]].
<h4 id="device-fingerprinting">Device Fingerprinting</h4>
Sensors can provide information that can uniquely identify the device using those sensors.
Every concrete sensor model has minor manufacturing imperfections and differences that will be
unique to this model. These manufacturing variations and imperfections can be used to fingerprint
the device [[ACCELPRINT]] [[MOBILESENSORS]].
<h4 id="user-identifying">User Identifying</h4>
[=Sensor readings=] can be used to identify the user, for example via inferring
individual walking patterns from smartphone or wearable device motion sensors' data.
<h3 id="mitigation-strategies">Mitigation Strategies</h3>
<em>This section is non-normative.</em>
This section gives a high-level presentation of some of the mitigation strategies
specified in the normative sections of this specification.
<h4 id="secure-context">Secure Context</h4>
[=Sensor readings=] are explicitly flagged by the
Secure Contexts specification [[POWERFUL-FEATURES]]
as a high-value target for network attackers.
Thus all interfaces defined by this specification
or [=extension specifications=]
are only available within a [=secure context=].
<h4 id="permissions-policy" oldids="browsing-context,feature-policy">Permissions Policy</h4>
To avoid the privacy risk of sharing [=sensor readings=] with contexts unfamiliar
to the user, [=sensor readings=] are only available for the
[=documents=] which are [=allowed to use=] the [=policy-controlled features=] for
the given [=sensor type=]. See [[PERMISSIONS-POLICY]] for more details.
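<div class="example">
A non-normative sketch of how a document that is [=allowed to use=] a sensor feature
might delegate it to a cross-origin iframe through the <code>allow</code> attribute.
The "accelerometer" feature name is assumed here; the actual [=sensor feature names=]
are defined by the corresponding [=extension specifications=].
<pre highlight="js">
const iframe = document.createElement('iframe');
iframe.src = 'https://third-party.example/widget.html';
// Delegate the "accelerometer" policy-controlled feature to the embedded document.
iframe.allow = 'accelerometer';
document.body.appendChild(iframe);
</pre>
</div>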
<h4 id="focused-area" oldids="losing-focus">Focused Area</h4>
[=Sensor readings=] are only available for an [=navigable/active document=] if
the [=focus and origin check=] on it returns true.
This is done in order to mitigate the risk of a skimming attack against the
[=/navigable=] containing an element which has [=gains focus|gained focus=],
for example when the user carries out an in-game purchase using a third party
payment service from within an iframe.
<h4 id="visibility-state">Visibility State</h4>
[=Sensor readings=] are only available for the [=active documents=] whose
[=visibility state=] is "visible".
<h4 id="permissions" oldids="permissioning">Permissions API</h4>
Access to [=sensor readings=] is controlled by the Permissions API [[!PERMISSIONS]].
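<div class="example">
A non-normative sketch of querying the permission state of a sensor before using it.
The "gyroscope" permission name is assumed here; the actual [=sensor permission names=]
are defined by the corresponding [=extension specifications=].
<pre highlight="js">
navigator.permissions.query({ name: 'gyroscope' }).then(result => {
  // result.state is "granted", "denied" or "prompt".
  console.log('Gyroscope permission state:', result.state);
});
</pre>
</div>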
<h3 id="mitigation-strategies-case-by-case">Mitigation strategies applied on a case by case basis</h3>
Each [=sensor type=] will need to be assessed individually,
taking into account the use cases it enables
and its particular threat profile.
While some of the below mitigation strategies
are effective for certain sensors,
they might also hinder or altogether prevent certain use cases.
Note: These mitigation strategies can be applied constantly or temporarily,
for example when the user is carrying out specific actions,
when other APIs which are known to amplify the level of the threat are in use,
etc.
<h4 id="limit-max-frequency" dfn>Limit maximum sampling frequency</h4>
User agents and [=extension specifications=] may mitigate certain threats by defining a [=sensor
type=]'s [=sensor type/maximum sampling frequency=].
What upper limit to choose depends on the [=sensor type=],
the kind of threats the user agent is trying to protect against,
the expected resources of the attacker, etc.
Limiting the [=sensor type/maximum sampling frequency=] prevents use cases which rely on low
latency or high data density.
<h4 id="stop-sensor" dfn>Stop the sensor altogether</h4>
This is obviously a last-resort solution,
but it can be extremely effective if applied temporarily,
for example to prevent password skimming attempts
when the user is entering credentials on a different origin ([[rfc6454]])
or in a different application.
<h4 id="limit-number-of-delivered-readings" dfn>Limit number of delivered readings</h4>
An alternative to [=limit maximum sampling frequency|limiting the maximum sampling frequency=] is to
limit the number of [=sensor readings=] delivered to the Web application developer,
regardless of the [=sampling frequency=].
This allows use cases which have a low latency requirement
to increase the [=sampling frequency=]
without increasing the amount of data provided.
Discarding intermediary readings prevents certain use cases,
such as those relying on certain kinds of filters.
<h4 id="reduce-accuracy" dfn>Reduce accuracy</h4>
Reducing the accuracy of [=sensor readings=]
or sensor [=reading timestamps=]
might also help mitigate certain threats; thus, user agents should not provide
unnecessarily verbose readouts of sensor data.
Implementations of concrete sensors may define a [=threshold check algorithm=]
so that new readings that do not differ enough from the [=latest readings=] are
discarded.
Implementations of concrete sensors may define a [=reading quantization
algorithm=] to reduce the accuracy of the [=sensor readings=] received from a
[=device sensor=].
Note: These two mitigation measures often complement each other. An
implementation that only executes the [=threshold check algorithm=] might
expose readings that are too precise, while an implementation that only rounds
readings up may provide attackers with information about more precise readings
when raw readings are rounded to different values.
Note: Inaccuracies will further increase for operations carried out on the
[=sensor readings=], or time deltas calculated from the [=reading timestamp|timestamps=].
So, this mitigation strategy can affect certain use cases.
Note: While adding random bias to [=sensor readings=] has similar effects,
it shouldn't be used in practice
as it is easy to filter out the added noise.
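<div class="example">
The following non-normative sketch illustrates the combined effect of a hypothetical
[=threshold check algorithm=] and [=reading quantization algorithm=] for an illuminance
value. The 25 lux threshold and the 50 lux rounding step are made-up values used for
illustration only; concrete sensor specifications define their own algorithms.
<pre highlight="js">
// Hypothetical threshold check: only accept readings that differ from the
// latest accepted one by at least 25 lux.
function passesThresholdCheck(newLux, latestLux) {
  return latestLux === null || Math.abs(newLux - latestLux) >= 25;
}

// Hypothetical quantization: round illuminance to the nearest 50 lux.
function quantizeIlluminance(lux) {
  return Math.round(lux / 50) * 50;
}

let latest = null;
for (const raw of [103, 110, 131, 180]) {
  if (passesThresholdCheck(raw, latest)) {
    latest = raw;
    console.log('exposed reading:', quantizeIlluminance(raw));
  }
}
// Logs 100 (for 103), 150 (for 131) and 200 (for 180); 110 is discarded.
</pre>
</div>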
<h4 id="inform-user">Keep the user informed about API use</h4>
User agents may choose to keep the user informed
about current and past use of the API.
Note: This does not imply keeping a log of the actual [=sensor readings=]
which would have issues of its own.
<h2 id="concepts">Concepts</h2>
<h3 id="concepts-sensors">Sensors</h3>
The term <dfn id="concept-device-sensor">device sensor</dfn> refers to a device's underlying
physical sensor instance.
A [=device sensor=] measures physical quantities
and provides a corresponding <dfn export>sensor reading</dfn>
which is a source of information about the environment.
Each [=sensor reading=] is composed of the values
of the physical quantity measured by the [=device sensor=]
at time <var ignore>t<sub>n</sub></var> which is called the <dfn>reading timestamp</dfn>.
If the [=device sensor=] performs a spatial measurement (e.g.
acceleration, angular velocity), it must be resolved in
a <dfn export>local coordinate system</dfn> that represents
a reference frame for the [=device sensor=]'s [=sensor readings=].
A [=device sensor=] that provides such [=sensor readings=]
is referred to as <dfn export>spatial sensor</dfn>.
A [=spatial sensor=] can be uniaxial, biaxial,
or triaxial, depending on the number of orthogonal axes in
which it can perform simultaneous measurements.
Scalar physical quantities (e.g. temperature) do not require
a [=local coordinate system=] for resolution.
The [=local coordinate system=] normally used in a mobile device is
a Cartesian coordinate system, which is defined relative to the
device's screen, so that the X and Y axes are parallel to the screen
dimensions and the Z axis is perpendicular to the screen surface.
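<div class="example">
A non-normative illustration of a [=spatial sensor=]'s [=local coordinate system=],
assuming the Accelerometer interface defined by its [=extension specification=]:
with the device lying flat on a table, screen facing up, the acceleration due to
gravity is expected to appear mostly on the Z axis (approximately +9.8 m/s<sup>2</sup>),
while the X and Y components stay close to zero.
<pre highlight="js">
const accelerometer = new Accelerometer({ frequency: 10 });
accelerometer.addEventListener('reading', () => {
  // X and Y are parallel to the screen, Z is perpendicular to it.
  console.log('x:', accelerometer.x, 'y:', accelerometer.y, 'z:', accelerometer.z);
});
accelerometer.start();
</pre>
</div>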
The term <dfn id="concept-platform-sensor">platform sensor</dfn> refers to platform interfaces,
with which the user agent interacts to obtain [=sensor readings=] for a single [=sensor type=]
originated from one or more [=device sensors=].
[=Platform sensor=] can be defined by the underlying platform (e.g. in a native sensors framework)
or by the user agent, if it has a direct access to [=device sensor=].
From the implementation perspective [=platform sensor=] can be treated as a software proxy for the
corresponding [=device sensor=]. It is possible to have multiple [=platform sensors=] simultaneously
interacting with the same [=device sensor=] if the underlying platform suppports it.
In simple cases, a [=platform sensor=] corresponds to a single [=device sensor=],
but if the provided [=sensor readings=] are a product of [=sensor fusion=] performed
in software, the [=platform sensor=] corresponds to a set of [=device sensors=]
involved in the [=sensor fusion=] process.
Discrepancies between a [=sensor reading=]
and the corresponding physical quantity being measured
are corrected through <dfn>calibration</dfn>, which can happen at manufacturing time.
Some sensors can require dynamic calibration to compensate for unknown discrepancies.
Note: [=Platform sensors=] created through [=sensor fusion=] are sometimes
called virtual or synthetic sensors. However, the specification doesn't
make any practical distinction between them.
<h3 id="concepts-sensor-types">Sensor Types</h3>
Different [=sensor types=] measure different physical quantities
such as temperature, air pressure, heart-rate, or luminosity.
For the purpose of this specification we distinguish between
[=high-level=] and [=low-level=] [=sensor types=].
[=Sensor types=] which are characterized by their implementation
are referred to as <dfn>low-level</dfn> sensors.
For example a Gyroscope is a [=low-level=] [=sensor type=].
Sensors named after their [=sensor readings|readings=],
regardless of the implementation,
are said to be <dfn>high-level</dfn> sensors.
For instance, geolocation sensors provide information about the user's location,
but the precise means by which this data is obtained
is purposefully left opaque
(it could come from a GPS chip, network cell triangulation,
wifi networks, etc. or any combination of the above)
and depends on various, implementation-specific heuristics.
[=High-level=] sensors are generally the fruits of
applying algorithms to [=low-level=] sensors--
for example, a pedometer can be built using only the output of a gyroscope--
or of [=sensor fusion=].
That said, the distinction between
[=high-level=] and [=low-level=] [=sensor types=]
is somewhat arbitrary and the line between the two is often blurred.
For instance, a barometer, which measures air pressure,
would be considered [=low-level=] for most common purposes,
even though it is the product of the [=sensor fusion=] of
resistive piezo-electric pressure and temperature sensors.
Exposing the sensors that compose it would serve no practical purpose;
who cares about the temperature of a piezo-electric sensor?
A pressure-altimeter would probably fall in the same category,
while a nondescript altimeter--
which could get its data from either a barometer or a GPS signal--
would clearly be categorized as a [=high-level=] [=sensor type=].
Because the distinction is somewhat blurry,
extensions to this specification (see [[#extensibility]])
are encouraged to provide domain-specific definitions of
[=high-level=] and [=low-level=] sensors
for the given [=sensor types=] they are targeting.
[=Sensor readings=] from different [=sensor types=] can be combined together
through a process called <dfn>sensor fusion</dfn>.
This process provides [=high-level|higher-level=] or
more accurate data (often at the cost of increased latency).
For example, the [=sensor readings|readings=] of a triaxial magnetometer
need to be combined with the [=sensor readings|readings=] of an accelerometer
to provide a correct bearing.
Smart sensors and sensor hubs
have built-in compute resources which allow them
to carry out [=calibration=] and [=sensor fusion=] at the hardware level,
freeing up CPU resources and lowering battery consumption in the process.
[=Sensor fusion=] can also be carried out in software if it cannot be
performed at the hardware level or if an application-specific
[=sensor fusion|fusion=] algorithm is required.
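<div class="example">
A non-normative sketch of [=sensor fusion=] performed in software: a simple
complementary filter that estimates the device's tilt around the X axis by combining
gyroscope and accelerometer readings. The Accelerometer and Gyroscope interfaces and
the filter coefficient are assumptions made for illustration; a production-quality
fusion algorithm would be considerably more involved.
<pre highlight="js">
const accelerometer = new Accelerometer({ frequency: 60 });
const gyroscope = new Gyroscope({ frequency: 60 });

let tilt = 0;             // estimated tilt around the X axis, in radians
let lastTimestamp = null;
const alpha = 0.98;       // weight given to the integrated gyroscope estimate

gyroscope.addEventListener('reading', () => {
  if (lastTimestamp !== null && accelerometer.hasReading) {
    const dt = (gyroscope.timestamp - lastTimestamp) / 1000; // seconds
    // Short-term estimate: integrate the angular velocity around X.
    const gyroTilt = tilt + gyroscope.x * dt;
    // Long-term estimate: tilt derived from the gravity vector.
    const accelTilt = Math.atan2(accelerometer.y, accelerometer.z);
    tilt = alpha * gyroTilt + (1 - alpha) * accelTilt;
  }
  lastTimestamp = gyroscope.timestamp;
});

accelerometer.start();
gyroscope.start();
</pre>
</div>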
## Default sensor ## {#concepts-default-sensor}
The Generic Sensor API is designed to make the most common use cases straightforward
while still enabling more complex use cases.
Most devices deployed today do not carry more than one
[=device sensor=] providing [=sensor readings=] of the same [=sensor type|type=].
The use cases which require a set of similar [=device sensors=] are rare
and generally limited to specific [=sensor types=],
such as multiple accelerometers in 2-in-1 laptops.
The API therefore makes it easy to interact with
the device's default (and often unique) [=device sensor|sensor=]
for each [=sensor types|type=]
simply by instantiating the corresponding {{Sensor}} subclass.
Indeed, without specific information identifying a particular [=device sensor|sensor=]
of a given [=sensor type|type=], the <dfn export>default sensor</dfn> is chosen by the
user agent.
If the underlying platform provides an interface to find the [=default sensor=],
the user agent must choose the sensor offered by the platform; otherwise, the user agent
itself defines which of the [=device sensor|sensors=] present on the device is
the [=default sensor=].
<div class="example">
Listening to the default accelerometer changes:
<pre highlight="js">
let sensor = new Accelerometer({ frequency: 30 });
sensor.onreading = () => { ... }
sensor.start();
</pre>
</div>
Note: Extensions to this specification may choose not to define a [=default sensor=]
when doing so wouldn't make sense.
For example, it does not make sense to explicitly define a default
[=device sensor|sensor=] for geolocation [=sensor type=] as the
implementation of its interface can use multiple backends.
In cases where
multiple [=device sensors=] corresponding to the same [=sensor type|type=]
may coexist on the same device,
extension specifications will have to
define ways to uniquely identify each one.
<div class="example">
For example, checking the pressure of the left rear tire:
<pre highlight="js">
const sensor = new DirectTirePressureSensor({ position: "rear", side: "left" });
sensor.onreading = _ => console.log(sensor.pressure);
sensor.start();
</pre>
</div>
## Sampling Frequency and Reporting Frequency ## {#concepts-sampling-and-reporting-frequencies}
For the purpose of this specification, a [=platform sensor=]'s <dfn>sampling frequency</dfn> is
defined as the frequency at which a [=platform sensor=] obtains [=sensor readings=] from the
underlying [=device sensor=]. The way such [=sensor readings=] are obtained is
[=implementation-defined=].
The [=platform sensor=]'s [=sampling frequency=] may not correspond to the [=device sensor=]'s
actual sampling rate, which, for the purpose of this specification, is opaque.
Note: System-level APIs for [=sensor readings=] and the underlying hardware interface to the sensors
themselves may be built for polling or events. For a polling-based [=device sensor=], the [=platform
sensor=]'s [=sampling frequency=] would be the rate at which a new reading is requested from the
system or hardware. For an event-based [=device sensor=], a [=platform sensor=] provides a requested
sampling frequency to the system or hardware, and events are generated at that frequency or below.
Events may not be generated if the sensor reading has not changed.
A [=device sensor=] may provide bounds for the sampling frequency value it can accept from a
[=platform sensor=] in the form of a <dfn for="device sensor">minimum sampling frequency</dfn> and a
<dfn for= "device sensor">maximum sampling frequency</dfn>. A [=platform sensor=]'s [=sampling
frequency=] must not be less than the [=device sensor=]'s [=device sensor/minimum sampling
frequency=] or greater than its [=device sensor/maximum sampling frequency=].
A [=platform sensor=]'s [=sampling frequency=] is determined based on the provided
{{Sensor/[[frequency]]}} of the [=set/items=] in its [=ordered set|set=] of [=activated sensor
objects=]. The calculation is [=implementation-defined=], but the outcome value must lie within the
bounds set by the [=platform sensor=]'s [=sensor type=]'s [=sensor type/minimum sampling
frequency|minimum=] and [=sensor type/maximum sampling frequency|maximum=] sampling frequencies and
its [=device sensor=]'s [=device sensor/minimum sampling frequency|minimum=] and [=device
sensor/maximum sampling frequency|maximum=] sampling frequencies.
Note: For example, the user agent may estimate the [=sampling frequency=] as a Least Common
Denominator (LCD) for a set of provided {{Sensor/[[frequency]]}} capped by [=sampling frequency=]
bounds defined by the underlying platform.
The <dfn>reporting frequency</dfn> for a concrete {{Sensor}} object is defined as the frequency at which
the "reading" event is [=fire an event|fired=] at this object.
A {{Sensor}} object cannot access new [=sensor readings|readings=] at a higher rate than the
user agent obtains them from the underlying platform, therefore the [=reporting frequency=] can
never exceed a [=platform sensor=]'s [=sampling frequency=], which in turn can never exceed a
[=device sensor=]'s [=device sensor/maximum sampling frequency=] (when specified).
The [=reporting frequency=] differs from the {{Sensor}}'s {{Sensor/[[frequency]]}} in cases such as:
- the requested {{Sensor/[[frequency]]}} lies outside the bounds returned by invoking [=get a
platform sensor's sampling bounds=] with {{Sensor}}'s associated [=platform sensor=].
- the operating system and/or the [=device sensor=] automatically discard
readings that do not differ enough (in absolute or relative terms) from the
previously reported ones via a hardware or operating system filter.
- the {{Sensor}} instance's associated [=sensor type=]'s [=threshold check
algorithm=] fails and the [=platform sensor=]'s [=latest readings=] are not
updated.
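<div class="example">
The following non-normative sketch shows how the observed [=reporting frequency=] can be
measured and compared with the requested {{Sensor/[[frequency]]}}; for the reasons listed
above it may turn out to be lower than the requested value. The Gyroscope interface is
used here only as an example of a concrete {{Sensor}} subclass.
<pre highlight="js">
// Request a high frequency; the actual reporting frequency may be lower.
const sensor = new Gyroscope({ frequency: 120 });
let count = 0;
let firstTimestamp = null;

sensor.addEventListener('reading', () => {
  if (firstTimestamp === null) {
    firstTimestamp = sensor.timestamp;
    return;
  }
  count++;
  const elapsedSeconds = (sensor.timestamp - firstTimestamp) / 1000;
  if (elapsedSeconds >= 5) {
    console.log('Observed reporting frequency:',
                (count / elapsedSeconds).toFixed(1), 'Hz');
    sensor.stop();
  }
});
sensor.start();
</pre>
</div>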
## Conditions to expose sensor readings ## {#concepts-can-expose-sensor-readings}
The user agent <dfn>can expose sensor readings</dfn> to a {{Document}}
|document| if and only if all of the following are true:
- |document|'s [=relevant settings object=] is a [=secure context=].
- |document|'s [=visibility state=] is "visible".
- The [=focus and origin check=] on |document| returns true.
- <dfn export>Specific conditions</dfn>: [=Extension specifications=] may add new
conditions to this list to have stricter requirements for their sensor types.
Note: In addition to the conditions above, it is important to note that {{Sensor}}
subclasses invoke the [=check sensor policy-controlled features=] operation in their
constructors, and [[#sensor-start]] invokes [=request sensor access=]. Together, these
checks correspond to the mitigation strategies described in [[#mitigation-strategies]].
Note: In order to release hardware resources, the user agent can request underlying
[=platform sensor=] to suspend notifications about newly available readings until it
[=can expose sensor readings=].
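<div class="example">
A non-normative sketch showing the effect of these conditions from a Web application's
point of view: while the {{Document}}'s [=visibility state=] is not "visible", no new
"reading" events are delivered, and delivery resumes once the user agent
[=can expose sensor readings=] again.
<pre highlight="js">
const sensor = new Accelerometer({ frequency: 10 });
sensor.addEventListener('reading', () => {
  // Only fires while the conditions above hold for the current document.
  console.log('reading at', sensor.timestamp);
});
sensor.start();

document.addEventListener('visibilitychange', () => {
  // While the page is hidden, no new readings are exposed.
  console.log('visibility state is now', document.visibilityState);
});
</pre>
</div>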
<h2 id="model">Model</h2>
<h3 id="model-sensor-type">Sensor Type</h3>
A <dfn>sensor type</dfn> must have the following associated data:
- One or more [=extension sensor interfaces=].
- A [=set/is empty|non-empty=] [=ordered set=] of associated [=powerful feature/name|powerful
feature names=] referred to as <dfn export>sensor permission names</dfn>.
Note: Multiple [=sensor types=] may share the same [=powerful feature/name=].
- A [=set/is empty|non-empty=] [=ordered set=] of associated [=policy-controlled feature=] tokens
referred to as <dfn export>sensor feature names</dfn>.
- A [=permission revocation algorithm=].
- A <dfn export for="sensor type">minimum sampling frequency</dfn>, a positive number. It is either
[=implementation-defined=] or defined by an [=extension specification=]. If both are set, the
largest value is used.
- A <dfn export for="sensor type">maximum sampling frequency</dfn>, a positive number. It is either
[=implementation-defined=] or defined by an [=extension specification=]. If both are set, the
smallest value is used.
The [=sensor type/minimum sampling frequency=] must not be greater than the [=sensor type/maximum
sampling frequency=].
A [=sensor type=] may have the following associated data:
- A [=default sensor=].
- A <dfn export>threshold check algorithm</dfn>, which takes as arguments two separate [=sensor
readings=] and determines if they differ enough to cause a [=platform sensor=]'s [=latest
reading=] map to be updated.
- A <dfn export>reading quantization algorithm</dfn>, which takes a [=sensor reading=] and returns a
less accurate [=sensor reading=].
- A [=virtual sensor type=].
<h3 id="model-sensor">Sensor</h3>
The current [=browsing context=]'s [=platform sensor=] must have:
- An associated [=ordered set|set=] of <dfn>activated sensor objects</dfn>,
which is initially [=set/is empty|empty=];
- An associated <dfn>latest reading</dfn> [=ordered map|map=], which holds the
latest available [=sensor readings=].
- An associated [=sensor type=].
Any time a new [=sensor reading=] for a [=platform sensor=] is obtained and if the user agent
[=can expose sensor readings=] to the current [=/navigable=]'s [=navigable/active document=],
the user agent invokes [=update latest reading=] with the [=platform sensor=] and
the [=sensor reading=] as arguments.
The [=latest reading=] [=ordered map|map=] contains an [=map/entry=] whose [=map/key=] is
"timestamp" and whose [=map/value=] is a high resolution timestamp that estimates the
[=reading timestamp=] expressed in milliseconds as an [=monotonic clock/unsafe current time=].
[=Latest reading=]["timestamp"] is initially set to null, unless the [=latest reading=] [=map=]
caches a previous [=sensor readings|reading=].
The other [=map/entries=] of the [=latest reading=] [=ordered map|map=]
hold the values of the different quantities measured by the [=platform sensor=].
The [=map/keys=] of these [=map/entries=] must match
the [=attribute=] [=identifier=] defined by the [=sensor type=]'s
associated [=extension sensor interface=].
The return value of the [=attribute=] getter is
easily obtained by invoking [=get value from latest reading=]
with the object implementing the [=extension sensor interface=]
and the [=attribute=] [=identifier=] as arguments.
The [=map/value=] of all [=latest reading=] [=map/entries=]
is initially set to null.
<!-- ,
unless the [=latest reading=] [=ordered map|map=]
caches a previous [=sensor readings|reading=].
Note: There are additional privacy concerns when using cached [=sensor readings|readings=]
which predate either [=navigating=] to resources in the current [=origin=],
or being granted permission to access the [=platform sensor=]. -->
<div algorithm>
To <dfn>get a platform sensor's sampling bounds</dfn> given a [=platform sensor=]
|platformSensor|:
1. Let |minimumFrequency| be |platformSensor|'s [=sensor type=]'s [=sensor type/minimum sampling
frequency=].
1. If |platformSensor|'s connected [=device sensor=] has a [=device sensor/minimum sampling
frequency=], set |minimumFrequency| to the maximum of |minimumFrequency| and this value.
1. Let |maximumFrequency| be |platformSensor|'s [=sensor type=]'s [=sensor type/maximum sampling
frequency=].
1. If |platformSensor|'s connected [=device sensor=] has a [=device sensor/maximum sampling
frequency=], set |maximumFrequency| to the minimum of |maximumFrequency| and this value.
1. Return a [=tuple=] (|minimumFrequency|, |maximumFrequency|).
</div>
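<div class="example">
A non-normative JavaScript sketch of [=get a platform sensor's sampling bounds=]. The
shape of the <code>platformSensor</code> object is hypothetical and only serves to make
the steps above concrete.
<pre highlight="js">
function getSamplingBounds(platformSensor) {
  const { sensorType, deviceSensor } = platformSensor;

  let minimumFrequency = sensorType.minimumSamplingFrequency;
  if (deviceSensor.minimumSamplingFrequency !== undefined) {
    minimumFrequency = Math.max(minimumFrequency, deviceSensor.minimumSamplingFrequency);
  }

  let maximumFrequency = sensorType.maximumSamplingFrequency;
  if (deviceSensor.maximumSamplingFrequency !== undefined) {
    maximumFrequency = Math.min(maximumFrequency, deviceSensor.maximumSamplingFrequency);
  }

  return [minimumFrequency, maximumFrequency];
}
</pre>
</div>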
<div class=example>
This example illustrates a possible implementation of the described [[#model|Model]].
In the diagram below several [=activated sensor objects|activated=] {{Sensor}} objects from two
different [=browsing contexts=] interact with a single [=device sensor=].
<img srcset="images/generic_sensor_model.svg" src="images/generic_sensor_model.png" alt="Generic Sensor Model">
The {{Sensor}} object in "idle" [[#sensor-lifecycle|state]] is not among the [=platform sensor=]'s
[=activated sensor objects=] and thus it does not interact with the [=device sensor=].
In this example there is a [=platform sensor=] instance per [=browsing context=].
The [=latest reading=] [=ordered map|map=] is shared between {{Sensor}} objects from the
same [=browsing context|context=] and is updated at a rate equal to the requested [=sampling frequency=]
of the corresponding [=platform sensor=].
</div>
<h2 id="api">API</h2>
<h3 id="the-sensor-interface">The Sensor Interface</h3>
<pre class="idl">
[SecureContext, Exposed=(DedicatedWorker, Window)]
interface Sensor : EventTarget {
readonly attribute boolean activated;
readonly attribute boolean hasReading;
readonly attribute DOMHighResTimeStamp? timestamp;
undefined start();
undefined stop();
attribute EventHandler onreading;
attribute EventHandler onactivate;
attribute EventHandler onerror;
};
dictionary SensorOptions {
double frequency;
};
</pre>
A {{Sensor}} object has an associated [=platform sensor=].
Concrete {{Sensor}} objects also have an associated [=sensor type=], which is the [=sensor type=]
that has their [=interface=] among its [=extension sensor interfaces=].
The [=task source=] for the [=tasks=] mentioned in this specification is the <dfn>sensor task source</dfn>.
<div class="example">
In the following example, we first check whether the user agent has permission to access
[=sensor readings=], then construct an accelerometer sensor and add
[=event listener|event listeners=] to get [=event|events=] for [=platform sensor=] activation,
error conditions, and notifications about newly available [=sensor readings=]. The example
measures and logs the maximum total acceleration of the device hosting the [=platform sensor=].
The [=event handler event types=] for the corresponding
[[#the-sensor-interface| Sensor Interface]]'s [=event handler=] attributes are defined in
[[#event-handlers|Event handlers]] section.
<pre highlight="js">
navigator.permissions.query({ name: 'accelerometer' }).then(result => {
if (result.state === 'denied') {
console.log('Permission to use accelerometer sensor is denied.');
return;
}
let acl = new Accelerometer({frequency: 30});
let max_magnitude = 0;
acl.addEventListener('activate', () => console.log('Ready to measure.'));
acl.addEventListener('error', event => console.log(\`Error: ${event.error.name}\`));
acl.addEventListener('reading', () => {
let magnitude = Math.hypot(acl.x, acl.y, acl.z);
if (magnitude > max_magnitude) {
max_magnitude = magnitude;
console.log(\`Max magnitude: ${max_magnitude} m/s2\`);
}
});
acl.start();
});
</pre>
</div>
### Sensor lifecycle ### {#sensor-lifecycle}
<style>
svg g.edge text {
font-size: 8px;
}
svg g.node text {
font-size: 10px;
}
</style>
<svg xmlns="http://www.w3.org/2000/svg" height="79pt" viewBox="0.00 0.00 351.00 78.51" width="351pt">
<style>
@media (prefers-color-scheme: dark) {
path,
polygon {
filter: invert(1);
}
text {
fill: #fff;
}
}
</style>
<g class="graph" transform="scale(1 1) rotate(0) translate(4 74.5122)">
<title>Sensor lifecycle</title>
<a xlink:href="#dom-sensor-state-slot">
<g class="node">
<title>idle</title>
<path d="M96.997,-64C96.997,-64 66.997,-64 66.997,-64 60.997,-64 54.997,-58 54.997,-52 54.997,-52 54.997,-36 54.997,-36 54.997,-30 60.997,-24 66.997,-24 66.997,-24 96.997,-24 96.997,-24 102.997,-24 108.997,-30 108.997,-36 108.997,-36 108.997,-52 108.997,-52 108.997,-58 102.997,-64 96.997,-64" fill="white" stroke="black"/>
<text text-anchor="middle" transform="translate(0,-2)" x="81.997" y="-41.2">idle</text>
</g>
</a>
<a xlink:href="#dom-sensor-state-slot">
<g class="node">
<title>activating</title>
<path d="M214.997,-64C214.997,-64 156.997,-64 156.997,-64 150.997,-64 144.997,-58 144.997,-52 144.997,-52 144.997,-36 144.997,-36 144.997,-30 150.997,-24 156.997,-24 156.997,-24 214.997,-24 214.997,-24 220.997,-24 226.997,-30 226.997,-36 226.997,-36 226.997,-52 226.997,-52 226.997,-58 220.997,-64 214.997,-64" fill="white" stroke="black"/>
<text text-anchor="middle" transform="translate(0,-2)" x="185.997" y="-41.2">activating</text>
</g>
</a>
<g class="edge">
<title>idle->activating</title>
<path d="M109,-38.0296C116.891,-37.4946 125.842,-37.2349 134.762,-37.2507" fill="none" stroke="black"/>
<polygon fill="black" points="144.762,-37.3855 134.702,-41.7502 139.762,-37.318 134.763,-37.2506 134.763,-37.2506 134.763,-37.2506 139.762,-37.318 134.823,-32.751 144.762,-37.3855 144.762,-37.3855" stroke="black"/>
<text text-anchor="middle" transform="translate(0,-4)" x="133.576" y="-20.9121">start()</text>
<a xlink:href="#sensor-start">
<text text-anchor="middle" transform="translate(0,-4)" x="133.576" y="-20.9121">start()</text>
</a>
</g>
<g class="edge">
<title>activating->idle</title>
<path d="M144.762,-50.6145C136.302,-50.8304 127.428,-50.7883 119.129,-50.4883" fill="none" stroke="black"/>
<polygon fill="black" points="109,-49.9704 119.217,-45.987 113.993,-50.2258 118.987,-50.4811 118.987,-50.4811 118.987,-50.4811 113.993,-50.2258 118.757,-54.9753 109,-49.9704 109,-49.9704" stroke="black"/>
<text text-anchor="middle" transform="translate(0,-4)" x="119.656" y="-59.3122">
<a xlink:href="#sensor-onerror">onerror</a>
</text>
</g>
<a xlink:href="#dom-sensor-state-slot">
<g class="node">
<title>activated</title>
<path d="M330.997,-40C330.997,-40 274.997,-40 274.997,-40 268.997,-40 262.997,-34 262.997,-28 262.997,-28 262.997,-12 262.997,-12 262.997,-6 268.997,-0 274.997,-0 274.997,-0 330.997,-0 330.997,-0 336.997,-0 342.997,-6 342.997,-12 342.997,-12 342.997,-28 342.997,-28 342.997,-34 336.997,-40 330.997,-40" fill="white" stroke="black"/>