[FoRK] PubSub NG Re: MQTT : Exploring the Protocols of IoT - News - SparkFun Electronics
J. Andrew Rogers
andrew at jarbox.org
Sat Feb 28 13:18:19 PST 2015
> On Feb 27, 2015, at 6:53 PM, Stephen D. Williams <sdw at lig.net> wrote:
> Your example below regarding large networks of high-bandwidth endpoints is interesting, and certainly should be handled. But there may be vast numbers of a whole range of sensors, from a few bytes per second on up.
You can’t push the requirements of high-rate sources down onto low-rate devices, and it defeats the purpose to limit a protocol to the capabilities of low-rate devices. The protocols that exist are poor for the high-rate source cases *because* they were designed around the constraints of low-rate sources.
Optimize for the high-rate source case. If low-rate sources can work within that, great. If not then who cares, since there are already myriad protocols for the low-rate case. Keep it simple and focused on the problem that needs a solution.
>> Also, these sensor platforms are increasingly being built using the modern data center model: reliability through cheap, redundant units. If a unit fails, no one cares.
> Of course, for the unit, but you still want data to get through reliably.
Quantity has a quality all its own. You do not need the data off any particular device as long as you can get data off a subset of devices with data capable of measuring or reconstructing the value you are after.
For a variety of reasons, sensor data sources in the wild have unreliable availability even when they do not fail per se. In practice, this problem and the problem of data quality are both addressed with information redundancy. Let the sensor sources vote on the state of reconstructed reality rather than trying to make any single data source infallible.
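A toy sketch of what "voting" means here, assuming nothing beyond a numeric quantity being measured by whichever sensors happen to be reporting; the function name and quorum threshold are illustrative, not from any real protocol:

```python
# Hypothetical sketch: reconstruct a measured value from a changing,
# unreliable subset of sensors by voting, rather than trusting any
# single device to be infallible.
from statistics import median

def reconstruct(readings, quorum=3):
    """Return the voted value if enough sensors reported, else None.

    readings: numeric samples from whichever sensors were available
    this interval; the membership of that set changes constantly.
    """
    if len(readings) < quorum:
        return None  # not enough redundancy to trust the estimate
    return median(readings)  # robust to a minority of bad sources

# One faulty sensor (99.0) is outvoted by the redundant majority.
print(reconstruct([21.4, 21.6, 99.0]))  # -> 21.6
print(reconstruct([21.4, 21.6]))        # -> None (below quorum)
```

The median is just one choice of robust aggregate; the point is that correctness lives in the aggregation, not in any one device.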
> Isn't it like the web in some cases? If I want to access the entry-way camera, flood sensor, etc. in my building from anywhere, but I should only get access if I'm verified as a resident?
Why would anyone need a new protocol to support this? Connecting to your smart toaster can be achieved with boring 1970s technology and is irrelevant to the task at hand.
> Interesting, but not the only use case. Nest thermostat, etc.
Data sources should not be conflated with devices; that is a Newtonian view of computing. The whole point of this is that many high-rate data sources can never be resolved to a device. Because physics.
The case where a data source always resolves to a device is the easy, narrow case that has been solved forever. This is about the more general case.
If I want to know a fact about the physical world in real-time, there is no requirement that I connect to a specific device as long as some set of devices can deliver the required measurements. The set of devices generating the data stream is constantly changing in practice. There is no ONE TRVE SENSOR providing these measurements. You are pushing your constraint into a dynamic cloud that in aggregate is generating the stream you see.
No one can consume the whole stream, so applications interact with it by steering constraints in the distributed stream to capture the bits they need. You can only consume as much of the stream as can be transported over your client’s wire but you can interact with the whole logical stream.
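A minimal sketch of that interaction model, under the assumption that a "constraint" is just a predicate pushed into the stream and the "wire" is a cap on delivered samples; the names here are illustrative, not a real API:

```python
# Hypothetical sketch of constraint steering: no client consumes the
# whole logical stream; each registers a predicate and receives only
# the matching slice, capped at what its wire can carry.
import itertools
from typing import Any, Callable, Dict, Iterable, Iterator

Sample = Dict[str, Any]

def steer(stream: Iterable[Sample],
          constraint: Callable[[Sample], bool],
          wire_capacity: int) -> Iterator[Sample]:
    """Yield at most wire_capacity samples matching the constraint.

    The client interacts with the whole logical stream through the
    constraint, but physically receives only a bounded slice of it.
    """
    matching = (s for s in stream if constraint(s))
    return itertools.islice(matching, wire_capacity)

# Aggregate stream produced by an anonymous, changing set of sources.
stream = [{"region": "sf", "temp": 18.0},
          {"region": "la", "temp": 24.0},
          {"region": "sf", "temp": 17.5}]
for s in steer(stream, lambda s: s["region"] == "sf", wire_capacity=2):
    print(s)
```

Note the client never names a device, only a property of the data; which sources happen to satisfy the constraint at any moment is invisible to it.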
Computing is only approximately Newtonian in very small or very slow systems. This is neither.