Sensors, measurements, observations and part 2 (geo data aka collections) #147
Based on the discussion so far in the aforementioned issue, there seem to be two suggested approaches to mapping a SensorThings API to collection(s): …
I see the first approach as targeted at new clients and services that wish to implement an OGC API - Features mechanism to retrieve sensor data while keeping to the familiar SensorThings data model. While this should totally be possible if organizations wish to do so, I wonder what the advantages are over using the existing SensorThings API? The second approach, which I would myself encourage, is intended as a bridge between the IoT and GIS worlds, allowing existing GIS clients to perform visualization and analysis. A link to the actual full SensorThings API is provided from that collection for SensorThings clients, since that API was really designed for sensor data and best handles the real-time nature and large quantity of the information. But the data is also alternatively provided as a feature collection and/or a coverage, so that GIS clients can readily and easily visualize and operate on the collection of sensor information as a whole. If there are proponents of both approaches, then I believe both should be possible ways of integrating sensor information within an OGC API. |
Continuing off of @dblodgett-usgs's comment on ObservationCollection: this should also align with what we've been doing on the ObservationCollection in O&M V3 |
Could you please point me at 'O&M v3'? FYI, I helped start/design Sensor Web Enablement (SWE) and 'O&M v1' ... but that was a long while ago. Thanks and Regards, |
O&M V3 Work here: https://github.com/opengeospatial/om-swg |
Thanks. Reading 'Rename Sensor to Observer' and other topics... |
There should be an obvious binding to the SOSA ontology as an expression of the O&M data model for people wishing to use JSON-LD or other semantically explicit approaches. |
Since the community has well-known encodings, an OGC API should be able to implement them. This reminds me of one of the Routing APIs and the Route Exchange Model. If you had to sketch a minimalist OGC API with just 1/2 cup of coffee ;-) one might start with a reference to O&M and some requirements classes such as...

2a. A successful execution of the operation SHALL be reported as a response with an HTTP status code 200.
2b. The content of that response SHALL conform to a requirements class of the Observations Model.
2c. By default (and this requirements class provides no mechanism to change the default), the content SHALL conform to the requirements class "Observations Model (full)".
2d. If the request included an Accept-Language header, the server SHALL try to honor the request and otherwise fall back to an available language.
2e. At this point you realize the observation may not be able to be represented as GeoJSON, that what you just wrote is really dumb, and that you need more coffee...

Of course, I didn't use collections but you could (e.g. /collections/observations ) |
@jeffharrison Sorry for the added confusion, but we had some interesting discussions on how to split O&M V3 into requirements classes during today's SWG call. It's to do with the work on interfaces we've been doing in V3, allowing for much more abstraction if required. This led to the question of which blocks we define as individual conformance classes for O&M. While very much a discussion in progress, it looks like we may be seriously disaggregating O&M, providing individual conformance classes for Observation, ObservableProperty, Process... I don't quite get the bit in 2e, except for the needing more coffee ;) To my current understanding, while collections may contain spatial features, there is no requirement that they do. Thus, not being able to provide all collection contents as GeoJSON is a wider issue |
Going back to @jerstlouis' large post with the two models. One advantage I mentioned in the #140 discussion: using this, it would be possible to combine STA with other feature collections in the same API, having collections for buildings, floors, rooms, vehicles, etc., and having all those seamlessly link to observations. The second option sounds like it is nice for a generic GIS client, but in reality it doesn't work except in the simplest use case: each Datastream has only one Observation. Let's look at an example, our AirQuality ad-hoc, since it is one of the simpler use cases. Data: https://airquality-frost.k8s.ilt-dmz.iosb.fraunhofer.de/v1.1
A generic GIS client would have no problem displaying the Things/Locations or the FeaturesOfInterest, but what would you put in the properties? It's not feasible to just list all Observations for all Datastreams for each Thing; way too much data. But every other selection is just arbitrary and will only fit specific use cases. For generic GIS clients, we just create a query that returns GeoJSON with just the data I want to display, like:
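For example, something along these lines (the nested $expand is standard STA query syntax; the GeoJSON result format is a FROST-Server extension, so treat that particular parameter as an assumption rather than part of the STA core):

```
GET /v1.1/Locations
    ?$select=name,location
    &$expand=Things/Datastreams($select=name;$expand=Observations($top=1;$orderby=phenomenonTime desc))
    &$resultFormat=GeoJSON
```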
Even if you want to display a certain time interval, like the data for the last day, you can do that with just a static URL. So a generic GIS client doesn't need to know SensorThings. And in my opinion, OGC Api isn't done until it is possible to do something like this in OGC Api too. So I wouldn't focus too much on generic GIS clients; they won't be able to do much with the non-geo data anyway. |
@hylkevds In my approach 2), you would be merging all observations from a single location & time into a single feature, regardless of how much data that is. And as a whole, a collection represents all locations and the entire duration for which observations have been collected. Common methods for filtering by space & time are used by the regular GIS client to filter the data of interest, so I disagree with your statement that "it doesn't work except in the simplest use case"; the complex use cases are what I am targeting. And in fact, implementing this Features and/or Coverage bridge could mean querying the SensorThings API behind the scenes, much like you just showed. As for the other way around, adding entity types to SensorThings: could there also be a way to establish links from SensorThings entities to e.g. items of a Features API, where both can be used together for what they do best?
I believe they can do a lot if the data is organized the way usual feature collections or coverages are. |
Are you sure you mean a single feature? How would that work? GetFeature would return GBs of data! You'd need ways to filter the contents of the feature... And you'd have to know what you can filter on for each feature before getting the feature... And it would not just need filtering on space and time, but also on other properties, not least of which is ObservedProperty. |
@hylkevds all observations from a single location and time; that is a limited set of information, at least for a single feature, is it not? Without any spatial or temporal filtering, yes, it would be GBs of data, but at worst the limit/next paging would kick in. GBs of data are a good thing if you want to perform interesting visualization and analysis. And of course you have the filtering. In Features, queryables and a filtering language (e.g. CQL) provide a mechanism to filter by data attributes (including a way to do temporal filtering). And spatial partitioning like DGGS or Tiles or BBOX allows you to filter on a specific geospatial region. You can also use properties= to select which data attributes (observations) you are specifically interested in. And this GIS bridge can simply map those queries to a SensorThings API query. I would very much like to give you a very concrete example of this, including a full live demonstration of the concept in action, but this will take a bit of time. In fact, IoT / GIS integration is one major topic of our collaborative project that we just started this week! I would be happy to keep exchanging with you on this and on use cases, and to share our progress.
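In the meantime, a rough sketch of the kind of request a GIS client could issue against such a bridge (the collection id and property names are purely hypothetical; bbox and datetime are standard Features parameters, while properties= is the draft selection mechanism mentioned above):

```
GET /collections/airquality/items
    ?bbox=8.3,48.9,8.5,49.1
    &datetime=2021-03-01T00:00:00Z/2021-03-02T00:00:00Z
    &properties=PM10,PM2.5
```

The bridge would translate such parameters into the equivalent SensorThings query behind the scenes. |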
Ah, ok, a single location and time. And what do you do with all the other data? Duplicate the full ObservedProperty data for each Observation? And all Sensor metadata? And what about the case where the UltimateFeatureOfInterest and ProximateFeatureOfInterest are different? One is the river, the other the water sample from that river that the measurement was made on. |
@hylkevds In a Features API, yes, it would mean duplicating the geometry for each observation. I am curious about sensor use cases where the feature is not just a point? Is that geometry ever dynamic in those cases? Even though a feature of interest might be a more complex geometry, potentially a sensor is situated at a specific location, and perhaps a separate collection with the more complex geometry could still be associated with it through some relational mechanism. As I mentioned in a previous post, there is also a mechanism in Features to select which properties you're interested in (properties=), if you are only interested in a subset of the observations. Sensor metadata could also be made available separately, as it is not essential for it to be integrated with the geospatial aspect for the visualization & analytics use cases. The key here is that your observations and your geospatial features are integrated in a way that allows a typical GIS client to perform some interesting visualization and analytics, leveraging the common spatio-temporal filtering mechanisms (e.g. of the Features and Coverage APIs), while still having a real, fully capable and efficient SensorThings API behind it all. |
Reading through the comments on the Features API, observations and filtering: at first you might think it's a bit strange and unworkable. However, I've helped design, develop and deploy 'REST' feature services that did similar things with massive amounts of near-real-time data. Filters worked great. Not advocating for anything, just providing input from deployments. Best Regards, |
Ah, I think I'm beginning to understand what your ideas are. That would work indeed. For some domains it's actually not that far off from certain access patterns. In the water domain for instance, the FeatureOfInterest is the water sample, taken at a point location, with about 15 to 20 Observations made on that sample. So there already exists a FoI for each phenomenonTime and location combination. The "sample view" of this dataset is more or less what you describe. It probably would still push quite a bit more data over the line than needed, depending on the exact domain model, but that's not always a problem. I didn't know Features had the "properties=" query option; good that they have that too. I love it in STA. |
Another relevant topic is that of nested collections. The SensorThings API has top-level entity sets ("collections") such as /Things, /Datastreams and /Observations.
But also nested entity sets, for example:
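(These are the standard STA v1.1 paths:)

```
GET /v1.1/Datastreams              a top-level entity set
GET /v1.1/Things(1)/Datastreams    the Datastreams linked to Thing 1
```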
One could see that as a nested collection, but in reality it is not. It is a subset of the /Datastreams collection. An entity cannot exist in such a nested entity set without being in a top-level entity set. Of course this only works because STA (OData) has entity linking as part of the core data model. |
From the perspective of conformance: if you wanted to state something about a service that supported a query that delivered a result set, you would need three things …
we seem to keep conflating these issues, and it's still not obvious to me exactly how we declare both the containing payload schema and the target feature model in a flexible way in the spirit of OGC API core... Can someone provide an example to show how Observation features are contained in a predictable container model, and exactly where that container model is specified? (Is that OGC API - Features, and is the implication that an O&M-capable profile therefore has to be a profile of the Features profile of core?) |
@jerstlouis I'd be VERY interested in your "very concrete example of this, including a full live demonstration", as I again fear we're talking across each other :( To what extent is your approach related to EDR, so just point at the space you want data for and get it? (Whereby I'm currently at a loss as to WHAT EDR returns :( ) It's very hard for me to project what we're doing with STA into the OGC API world, as I've yet to get a straight answer on whether a collection can contain objects without a geometry. While @cportele had confirmed that non-spatial objects can comprise a collection, this seems to have been rolled back in #149 |
#149 restores that capability. Where Collections.adoc currently states …
the PR changes this to …
So, there may be entire collections without a data item with a spatial geometry - as long as there is some collection in the dataset that has a spatial extent (otherwise, why would OGC care?). |
@rob-metalinkage: I suppose the best example of such a thing would be if one takes the SensorThings API and changes the URLs to be more like OGC API. We're a looooong way away from an OGC API - O&M, especially with O&M being in flux right now, so it's way too soon to start talking conformance classes. I think the SensorThings API is also a great example of the fact that to get a nice, easy-to-use, powerful API one cannot see those three issues as separate things. They strongly influence each other: …
These are all simple rules that could be applied to any service that has a data model behind it. So yes, we're "conflating" these issues, because they have to be conflated to get a good API... |
Per discussion in the OAB just now -- I want to bump this. It would be really nice to have some progress on how a SensorThings API (technically, I guess this means OData) endpoint(s) might be realized as a way to access data within an OGC API Collection. @cmheazel -- you put this as future work. Has there been any substantive chatter about this since 2020 that you are aware of? |
@dblodgett-usgs In terms of recent development, the Connected Systems and Moving Features APIs (extensions of OGC API - Features) may be related to this discussion. |
In my opinion the "Datastream" concept in STA was created to "group" observations. Actually it contains the observations of the same Sensor, the same Thing and the same ObservedProperty. I believe it can be assimilated to the current concept of "collection". So, to me, it would be possible to do something like the sketch below. All the query characteristics of OData will be unavailable in this endpoint but, at least, we will have the dataset available as collections. And it is easy to make an OpenAPI description of this approach. I'm not talking about replacing the OData protocol but only complementing it with an OpenAPI approach. I believe it is pointless to expose other objects' endpoints; it would make the API unnecessarily complicated.
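A minimal sketch of that mapping (identifiers purely illustrative; one OGC API collection per Datastream):

```
GET /collections                        one collection listed per Datastream
GET /collections/datastream-42/items    the Observations of that Datastream,
                                        each as a GeoJSON feature located at
                                        its FeatureOfInterest
```

Everything beyond this simple retrieval would stay on the OData side. |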
If the goal is to have a STA endpoint merged into an existing OGC-Api endpoint, it becomes a matter of: …
The first is very domain specific. I think that in many cases it is sufficient to have the … For the second point several options are possible. I'm not completely up-to-date on the OGC-Api spec, so I may have missed some: …
I think that would be enough for an OGC-Api client to also navigate into the SensorThings API parts, and for a combined OGC-API & STA client to figure out which features can be used where. |
Let me clarify. As a producer of geospatial data services that include both sensor things and features, I want a way to expose multiple views of my datasets that are coherently arranged as views of my datasets. I find the OpenAPI / OData dichotomy to be cumbersome and really just a technical detail that's confusing our client software developers. It would be awfully nice to accommodate both under the same top-level API landing page. |
Reviewing the thread again is interesting :-) I still think the issue is the difference between containers and specialisations: depending on the amount of semantics captured in the container model (feature, coverage, OData, network or any other reusable pattern), we end up with different burdens on the domain model specialisation to capture the dimensions of the data.

A single solution can be achieved by conflating these into a single API design, but the reality is that multiple solutions exist for good reasons, and we will need to separate the concerns by describing the relationship between a common conceptual model and these different, but equally valuable, implementation meta-models. We haven't had a canonical mechanism to describe these relationships - IMHO a legacy of UML and a tendency to build logical models and call them conceptual models.

An activity to formalise a way to describe how different encoding meta-models map to an underlying data model could include machine-readable formalisms as well as validation. It is to this end we are exploring the use of JSON-LD contexts to "semantically uplift" different JSON schemas to an RDF form, to which we can then apply validations, transformations and alignment models.

The challenge is to establish a vehicle whereby we can pool knowledge and effort rather than keep repeating specific viewpoints in the many places these issues keep cropping up. IMHO the OAB could define a set of model-agnostic requirements for APIs to support data models, and then apply STA to this as a test case, rather than argue around specific aspects of STA. The Network and Moving Features use cases are instructive - perhaps we should collect a set of functional use cases which we think require data models to be mapped to multiple API patterns. |
Another possibility would be that collections/{collectionId}/items/{itemId} responds with a GeoJSON document that is a flat representation of the STA entity model: the FeatureOfInterest is the geometry, and the observation result, the dates, the sensor description, the ObservedProperty description, the Thing, the UoM and all the rest... become properties.
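For illustration, such a flattened feature might look like this (all names and values invented for the example):

```json
{
  "type": "Feature",
  "id": "Observations(1234)",
  "geometry": { "type": "Point", "coordinates": [8.4037, 49.0069] },
  "properties": {
    "result": 42.1,
    "phenomenonTime": "2021-03-01T10:00:00Z",
    "observedProperty": "PM10",
    "unitOfMeasurement": "ug/m3",
    "sensor": "SDS011",
    "thing": "Air quality station 7"
  }
}
```

One such feature would exist per Observation. |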
This is exactly what I'm trying to do by calling attention to this issue. I'm not really interested in how observations, sensors, things, features, etc. relate (I mean, I am, but it's not really my focus here). I'm interested in having a coherent OGC landing page for a dataset where we can list all the ways of accessing that dataset. |
How do we reconcile the "Web Flavor of SWE" with the OGC API suite (+ Building Blocks), OGC API - EDR, and the discussions around STA being an OGC API flavour? Can we define a set of OGC API Building Blocks that cover the ISO 19156 model and have expressions in the different data models of the different APIs? Does a SWE API perhaps need GeoPose (which seems close to the SWE Sensor Model) as well - or need to be aligned to it? FYI, currently looking at how to create machine-readable OGC API building block definitions, the scope of the SOSA ontology update to match O&M v3, and how to create reusable building blocks for this and GeoPose |
I'm continuing my particular "thread" here. It is possible to have a single landing page with OGC API and STA, since STA has only 2 elements in the landing page, "value" and "serverSettings", and they do not interfere with the OGC API ones. So a hybrid landing page would look like the sketch below (OGC API first and STA later).
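A rough sketch (links abbreviated; the "value"/"serverSettings" part follows the STA v1.1 root document structure, the rest is illustrative):

```json
{
  "title": "Air quality service",
  "links": [
    { "rel": "service-desc", "href": "/api" },
    { "rel": "conformance", "href": "/conformance" },
    { "rel": "data", "href": "/collections" }
  ],
  "value": [
    { "name": "Things", "url": "/v1.1/Things" },
    { "name": "Datastreams", "url": "/v1.1/Datastreams" },
    { "name": "Observations", "url": "/v1.1/Observations" }
  ],
  "serverSettings": {
    "conformance": [
      "http://www.opengis.net/spec/iot_sensing/1.1/req/datamodel"
    ]
  }
}
```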
Now we only need to agree on what OGC API - Features could return in this case. Possibilities: … |
@joanma747 Thanks for sharing the prototype landing page. If STA offered a landing page like the one shown above, and also offered a … Note that OGC API - Processes - Part 1: Core does not support OGC API - Common - Part 2: Geospatial Data. In other words, OGC API - Processes - Part 1: Core does not offer a /collections endpoint. |
Hi - been looking at the various specifications and trying to work out in practice how this will work. This may be clearer to others, but AFAICT there is provision to reference a JSON schema in the link section for both items (features) and collections (FG-JSON proposes this using a "describedBy" link relation) - and the "type" property linked to a "semantic description". This seems like a workable start - what we need to do then is link the JSON schema to the semantics somehow - i.e. to let people know the observation relates to an OM_Observation (presumably via sosa:Observation, which provides a canonical URI and a semantic model using a canonical model form). It doesn't seem that there is a way to link JSON schemas with JSON-LD contexts (or any other way to do this) that works to annotate schemas directly - only to annotate instance documents.

The other piece that seems to be missing is a way to link a parameter in a query to the schema - again AFAICT this is a "black box" the server is responsible for. It's not obvious how to express that such a parameter is the same as a schema element - to meet the general model of filtering against a schema. If these are known and solved issues, they probably need to be promoted to a more obvious place in the documentation.

Finally - is there any clarity regarding how to define a "building block" so that it can be integrated automatically into an OAS description - how can, for example, the STA model and required query parameters be defined so that they can somehow be imported into an API, without writing a new document for each possible case? The OAS extensions mechanism at https://swagger.io/docs/specification/openapi-extensions/ doesn't provide any mechanism to plug in reusable definitions - only what they should look like in-situ. I feel we should take the basic architecture discussions offline somewhere else, but am not sure where or if they are being addressed. |
@rob-metalinkage, there is a "$ref" mechanism that can link to external documents that contain "shared" definitions. In the OGC API Tiles schemas, @jerstlouis did a careful implementation of them that goes even further (see https://schemas.opengis.net/ogcapi/tiles/part1/1.0/openapi/). I wonder if this is what you are looking for in your last paragraph. |
Perhaps some inspiration? https://docs.oasis-open.org/odata/odata-openapi/v1.0/odata-openapi-v1.0.html |
@rob-metalinkage More details about the approach mentioned by @joanma747, to define OpenAPI building blocks in such a way that they can easily be tailored and assembled for a particular API definition based on supported capabilities, are described in #302. The same approach is now also used in OGC API - Tiles, Coverages, Maps, DGGS and Processes. A top-level .yaml file simply includes one file for each supported API path, then …
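A toy example of that composition, in JSON form (file names are hypothetical; referencing a whole path item via $ref is standard OpenAPI):

```json
{
  "openapi": "3.0.3",
  "info": { "title": "Sensor API", "version": "1.0.0" },
  "paths": {
    "/collections": { "$ref": "paths/collections.json" },
    "/collections/{collectionId}/items": { "$ref": "paths/items.json" }
  }
}
```

Implementers can then drop the path entries they do not support without touching the shared definitions. |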
Sorry for jumping in late here, been a crazy week! Cool thread! On @cportele's statement: …
As OMS Observations (core model under STA) are geared to provide information on existing features, where would these be exposed? Separate Collection? How would we provide a link from these features to Observations thereon?
YES!!! Finding a way to indicate how the conceptual models created within the OGC align with implementations would be wonderful!!! Such a mechanism would be valuable far beyond the OMS/STA context being discussed here. Rob, do you think that this could be included in your slot in the Conceptual Modelling Group report on …
Note: mapping to SOSA could cause issues, as it's not quite aligned. Staying within the OGC context, maybe better to get the ModSpec bits on OMS running? From there we could then provide suitable links to SOSA, indicating where it's not quite aligned.
On the more general request I see here for simplified or flattened access to OMS-type data: we've been discussing this issue in the O&M SWG for a while now. In our view, it would be valuable to provide, in addition to STA, a linked simple access endpoint, whereby these simple and complex versions should be aware of each other and provide links. We're still chewing on possible representations for such a simple …
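One candidate shape, purely as an illustration (property names invented), would pivot each ObservableProperty into an attribute of the record:

```
// OMS/STA view: one record per Observation
{ "observedProperty": "PM10",  "phenomenonTime": "2021-03-01T10:00:00Z", "result": 18.2 }
{ "observedProperty": "PM2.5", "phenomenonTime": "2021-03-01T10:00:00Z", "result": 9.7 }

// simplified view: one record per time step, one attribute per ObservableProperty
{ "phenomenonTime": "2021-03-01T10:00:00Z", "PM10": 18.2, "PM2.5": 9.7 }
```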
I'm still not sure where the links would go in an OAPI-F or EDR approach; links to the corresponding Datastreams should be provided either within the properties or, alternatively, in the links section of the API. I've been working on defining the transformation utilized here with colleagues working under the CODATA DDI-CDI work, as nobody I asked could provide a correct term for this type of transmogrification (shifting the content of the ObservableProperty name to an attribute of the simplified data model). The paper will be published in IASSIST Quarterly this March; here is the last draft of Viewpoints on Datapoints. Finally, on merging STA within a wider OGC-API, I do pity the poor developer confronted with an API that drastically modifies its request structure for one specific branch of the wider API endpoints. |
Thanks all for the pointers - we are working (this weekend :-) ) to develop a PR for a better building block description template with machine-readable components, to implement a specific subtype of Observation as a GeoJSON (and FG-JSON) feature, predicated on interacting building blocks: one for Observation, a specialised form of Observation that defines the result schema, and a result schema using a third building block (GeoPose). This stuff will require much testing and consensus building in upcoming testbeds, but we think we can demonstrate how we can support the publication process with tooling that makes better practice more visible and feasible. |
PS - there will be a proposal for a SWG in the Agriculture DWG to work on a domain model for agriculture (AIM) that will include in its scope exploration of OGC API implementation - given that it's based on SOSA, PROV, GeoSPARQL etc., this activity will provide ample opportunity to test the ideas of reusable building blocks. As @KathiSchleidt points out, however, the commonality of semantics across different representational views is the issue here. There is a proposal from the FG-JSON space to reference both the "semantic type" and the schema ("describedBy") - so we could bridge this gap this way - but that's where SOSA vs OMS becomes a thing: SOSA is something we can reference, and Observation has a stable URI... OMS (and currently most ISO things and all other UML artefacts) are opaque to machine readability and cannot be referenced in a way that adds any direct value for interoperability. So for our experiment we'll use SOSA and assume that SOSA v2 will retain backwards compatibility and support OMS semantics with canonical identifiers. |
I pity the poor developer who has to figure out that two services offer the same data even though they have completely different landing pages and top-level metadata. We're putting two incompatible technology carts in front of a single horse and expecting things to go in the right direction. |
I tend to agree with @dblodgett-usgs here. Most clients will not be fetching data from both APIs. I think it's more important to clearly communicate that the different APIs serve the same data set, and that is easiest done by serving them from a common landing page. Ideally, clients will look at the list of conformance classes and, from the ones they recognise, know which paths to use. Within the OGC-API family this already happens: a Features client won't know what to do with tiles, but that's fine, since the Features client won't request data from the Tiles paths. Similarly, a STA client will just send its requests to the STA paths, and a (theoretical future) OGC-API-enabled OData client would only access the OData paths. It's not so much about making sure every client will magically know how to deal with every path that is offered, and more about making sure that every client knows how to find those paths that offer the data in the way it knows. This is of course a topic totally separate from the other topic mixed into this thread: how to offer OMS-like data in a Features-like API. Maybe we need to split this issue into two separate ones?
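Back to the conformance point: a combined declaration might look like the following, where the first URI is the real OGC API - Features core conformance class and the STA entry is shown only to illustrate the idea, not as an agreed URI:

```json
{
  "conformsTo": [
    "http://www.opengis.net/spec/ogcapi-features-1/1.0/conf/core",
    "http://www.opengis.net/spec/iot_sensing/1.1/req/datamodel"
  ]
}
```

A client that recognises neither URI would simply ignore the corresponding paths. |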
Agree we are falling over a few different things here :-)

At one level it is not at all clear there is common agreement on what a building block is - @jerstlouis thanks for that pointer - certainly that's one model that covers some of the building blocks being proposed - but it's a bit upside down, in that the methodology is being described in the specialised extensions, not the core, AFAICT. (OK, it's evolving - and something needs to be pushed back to the core.)

There are some design contracts here - APIs cannot have paths that clash if the composition/mix-in pattern is to work - does swagger_cli check this and reject attempts to bundle clashing building blocks? At any rate, the OGC infrastructure needs to check this automatically and reject building block proposals that cannot be composed this way.

Finally, it's not just the client choosing the pathways - it's the client knowing which pathways a particular data source is available from - for example, you might do a map search to find sensors using API-Features, but then access the data streams from these using STA - it's still not clear to me how we can declare these are the same. The /collections path allows for links to schemas using the "describedBy" relation - it's perfectly possible to have different data models for the same data, both described by GeoJSON schema - so this seems weak. Is it as simple as a contract that "describedBy" has to reference a specialised schema for a distinct meta-model for each API pathway - and does this imply further restrictions on naming (and validity checking we need to do at the building block register end)? |
@joanma747: Merging the STA …
Would that work? The … |
@hylkevds, looks like a nice alternative to me. Using a "version number" to identify the STA interface seems a bit odd to me, though. I'll propose a small modification to it: …
The /api interface does not necessarily have to describe the STA interface. Instead it should provide a collections endpoint for sensor data in GeoJSON or STAJSON (with Datastream as starting point).
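A sketch of what that could look like (the collection id is hypothetical, and "STAJSON" has no registered media type, so that one is a placeholder):

```
GET /collections/datastream-42/items
    Accept: application/geo+json          Observations as GeoJSON features

GET /collections/datastream-42/items
    Accept: application/vnd.sta+json      Observations in their native STA encoding
```

Content negotiation would then let each client pick the representation it understands. |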
@hylkevds @joanma747: any update on this approach? I'm wondering if we couldn't integrate this within some of the currently running IEs, in order to bridge the gap between the requirement for simple access to features vs. the requirements for provision of measurement data. |
This was brought up as a critical need for the hydrology domain at a session today. I think the discussion above is pointing to a good candidate approach to include STA in the context of the OGC-API landing page. How can we move this forward? Is there a contribution to be made to one of the API Common artifacts? |
I am interested in the outcome of this discussion as well! |
Please note also https://github.com/opengeospatial/ogcapi-sosa which is where a binding of OMS and SOSA to the GeoJSON/FG-JSON feature schemas (API-Features compatible) is being developed. |
Since Connected Systems is taking on this aspect, let's continue the discussion in terms of whether Part 2 is suitable for integration with Connected Systems. |
ConnectedSystems has a different scope, one that extends OMS concepts, but it should be based on a canonical approach to OMS. It is, however, following the current sub-optimal approach encouraged by OAS 3.0 of copying and pasting elements of related specifications into local copies of schemas, leaving no trace of interoperability expectations beyond descriptive text. ConnectedSystems is aiming at an OAS 3.1 baseline, which does not need to do this; however, without a canonical OMS-based schema this is still not transparent. Since it is based on OAS 3.1 and current versions of JSON Schema, it is not intended to be compatible with OGC API Common in its current version, and cannot be regarded as providing a solution. For a deeper understanding of the issues, read this blog post from one of the JSON Schema and OAS technical leads: https://modern-json-schema.com/your-path-from-openapi-30-to-31-and-beyond.

The OMS SWG is the relevant scope for this; it is working with the OGC/W3C Spatial Data on the Web Working Group (SDW) on an update to the SOSA ontology and has flagged work items for JSON encoding and Observable Properties descriptions. The SDW is undergoing a re-charter exercise, so any specific requirements …

Re-noting: an OGC API Common compatible solution is in draft form for key concerns around testing the ObservationCollections concepts as an implementation of the SOSA v2 model.

This particular issue should be decomposed into a set of relevant concerns. Firstly, the original issue description is relevant: it's critically important not to just accrete partial solutions into a common, complex and unstable specification, but instead to ensure the profiling and extension mechanisms are clear and tested, and that domain-specific specialisations can be easily created with transparent and testable interoperability with the core. Secondly, the transition from the OAS 3.0 baseline to future versions needs to be understood, and specification development processes tested to support this. Thirdly, the lessons from the first two concerns should be applied systematically by refactoring or updating "monolithic" specifications. Where these have been defined using a "cut and paste" approach to interoperability on some aspects, this may mean publishing new versions, or possibly just defining OGC Building Blocks to package them better, making the interoperability intentions clear and validating all examples and artefacts to ensure the specifications meet the requirements stated. |
In an attempt to offload the sensors-specific portion of the already lengthy and complex overall "collections" discussion in #140, we can continue the discussion here.