The IoT (Internet of Things) World Forum Architecture Committee (made up of Cisco, IBM, Rockwell Automation, Oracle, Intel and a variety of others) recently published an IoT reference model. It is probably fairer to call it a framework, since it is quite loose, but it is at least a starting point.
The reference model has seven tiers. Starting at the lowest tier there are physical devices and controllers (the things), then there is connectivity and, above that, edge computing where, for example, you might want to do some initial aggregation, de-duplication and analysis. These lower three levels can be considered operational technology (OT) whereas the remaining four levels are IT. The lowest level in the IT part of the stack is storage and this is succeeded in turn by data abstraction, applications, and collaboration and (business) processes.
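To make the stack concrete, here is a minimal sketch of the seven tiers as a simple data structure, with the OT/IT split described above. The tier names follow the reference model; the representation itself is just an illustration, not anything the Committee has published.

```python
# The seven tiers of the IoT World Forum reference model, ordered from
# lowest to highest, with the OT/IT domain split described in the text.
IOT_REFERENCE_MODEL = [
    # (tier number, tier name, domain)
    (1, "Physical devices and controllers", "OT"),
    (2, "Connectivity", "OT"),
    (3, "Edge computing", "OT"),
    (4, "Storage", "IT"),
    (5, "Data abstraction", "IT"),
    (6, "Applications", "IT"),
    (7, "Collaboration and business processes", "IT"),
]

def domain_of(tier_name: str) -> str:
    """Return 'OT' or 'IT' for a named tier."""
    for _, name, domain in IOT_REFERENCE_MODEL:
        if name.lower() == tier_name.lower():
            return domain
    raise KeyError(tier_name)
```

The point of encoding the split is that it marks where responsibility (and, typically, tooling) changes hands: everything up to edge computing is operational technology, everything above it is IT.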
I don’t think anyone could argue with this model as a matter of principle but, of course, the devil is in the detail. The Committee called for standards across the different layers, especially in the OT layers where they don’t currently exist. While I agree with the sentiment, I don’t see that happening any time soon. Communication standards, in particular, are an issue. For example, at the sensor level, many of the equipment providers use proprietary protocols, both because these act as differentiators and because they provide vendor lock-in. It seems the sensor manufacturers haven’t heard about the UNIX wars!
At the next level up, some commentators have suggested adopting RDF (Resource Description Framework), which is a W3C standard. Most people I speak to seem to like the idea but don’t see it happening, although RDF is becoming more widely used in support of semantics; but that’s another story.
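For readers unfamiliar with RDF, a sketch may help show why it appeals as a common interchange format: every statement is a subject–predicate–object triple. The example below emits triples for a single sensor reading in the N-Triples serialisation. The namespace (example.org) and the property names are invented for illustration; a real deployment would need an agreed vocabulary, which is precisely the standardisation gap discussed above.

```python
# Emit a sensor reading as RDF triples in N-Triples form. All identifiers
# here are hypothetical placeholders, not part of any published vocabulary.

def ntriples_for_reading(sensor_id: str, prop: str, value: float, unit: str) -> list:
    """Build N-Triples statements describing one sensor reading."""
    ex = "http://example.org/iot"
    subject = f"<{ex}/sensor/{sensor_id}>"
    return [
        f"{subject} <{ex}/vocab#measures> \"{prop}\" .",
        f"{subject} <{ex}/vocab#value> \"{value}\"^^<http://www.w3.org/2001/XMLSchema#double> .",
        f"{subject} <{ex}/vocab#unit> \"{unit}\" .",
    ]

triples = ntriples_for_reading("pump-17", "temperature", 64.2, "Cel")
```

Because the model is uniform, triples from different vendors’ devices could in principle be merged into one graph, which is the interoperability argument the commentators are making.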
While we are talking about communications, one thing that I think is clear is that you are going to end up with a distributed architecture: you aren’t going to want to move all of this information into a single place. For example, it will often be the case that only aggregated data is physically moved out of the OT layers into the IT layers, with perhaps temporary storage of detailed data in OT for use in operational analytics. If that’s the case then data virtualisation is going to be a key technology, not just for federating data across the IT layers but also for supporting the operational analytics just mentioned.
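The pattern just described can be sketched as follows: detailed readings stay in the OT layer, and only a de-duplicated aggregate is forwarded into the IT layers. The reading format (timestamp/value pairs) and the choice of summary statistics are assumptions made for illustration.

```python
# Edge-side aggregation sketch: raw readings are retained locally for
# operational analytics; only the summary record travels upstream.
from statistics import mean

def aggregate_for_upload(readings):
    """De-duplicate raw (timestamp, value) readings and reduce them to a
    summary record suitable for forwarding out of the OT layers."""
    seen = {}
    for timestamp, value in readings:
        seen[timestamp] = value  # last reading per timestamp wins
    values = list(seen.values())
    return {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": mean(values),
    }

raw = [(1, 20.0), (2, 21.5), (2, 21.5), (3, 19.0)]  # note one duplicate
summary = aggregate_for_upload(raw)
```

A data virtualisation layer would then present both views, the upstream summaries and the still-local detail, through one query interface, which is the federation role argued for above.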