With Katrina, people thought the meteorological event was just a black swan. It was not, and we realized that Earth is shifting into a different era of climate change. Recent events from September 2017 reaffirm a theory the scientific community has been shouting loudly: humans are influencing the climate, for the worse. The irony is that, due to the process of validating any scientific work, the conclusions reach policy makers and the public years later, which can be catastrophic when combined with the inertia of the political class and the interference of election cycles, but that is another discussion topic.
In the GPS tracking world, the majority of black-box providers deploy what is mostly a store-and-forward communication device. If, for some reason, that black box sits in the center of a Katrina-like area where communication is totally interrupted, it fills up its internal buffer with the activity of the gear it is monitoring. Once communication is restored, days or weeks later, the whole queue of unreported packets is dumped to the servers. And here is the dilemma: who should be throttling the traffic, the mobile provider or the end recipient of the packets? It is the role of the company providing the GPS monitoring to accept all the traffic and handle it.
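The store-and-forward behavior can be sketched as a bounded buffer that queues packets while the link is down and dumps the whole backlog at once on reconnect. This is a minimal illustration, not any vendor's actual firmware; the class and method names are hypothetical.

```python
from collections import deque

class StoreAndForwardDevice:
    """Hypothetical sketch of a tracker's store-and-forward buffer."""

    def __init__(self, capacity=10_000):
        # Bounded buffer: when full, the oldest packets are silently dropped,
        # one plausible reason data can be lost during a long outage.
        self.buffer = deque(maxlen=capacity)
        self.online = True

    def record(self, packet):
        if self.online:
            return [packet]         # delivered immediately
        self.buffer.append(packet)  # stored while the link is down
        return []

    def reconnect(self):
        """The moment the channel reopens: the entire backlog floods out."""
        self.online = True
        flushed = list(self.buffer)
        self.buffer.clear()
        return flushed
```

Note that the server sees nothing gradual: `reconnect()` hands over days of data in one burst, which is exactly the flood the rest of this post is about.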
If you want an analogy in human terms, imagine your partner talking to you while you are unable to hear her for half a day, all those words being stored in a device that, once the communication channel reopens, floods you with everything at once. Can you imagine any problems?
There are several problems with it.
- Your normal capacity for handling communication is swamped. Imagine it is the New Year and Boxing Day season in a few hours: are you prepared for that order-of-magnitude jump in transactions? Is your "weakest link" scalable enough to absorb the shock?
- Let’s assume you have a plan, and AWS is one answer, increasing your processing power by several orders of magnitude. Handling incoming data is mostly pipeline processing that generates alarms and derived data. How relevant is the incoming data for the users? Imagine you have to update a location on a map: does it make sense to do so when the data is stale? So we are talking about triage in processing data. The system has to know which paths are valid in case of a flood.
- There is another scenario I want to bring up: emergency packets. Imagine a driver in a life-threatening situation pressing what is called the "panic button". How does your system pick that packet out of the deluge of data and give it priority?
- Coming back to the name of the post, Verizontrina. There are situations when the cell towers go down for an area, and suddenly all communication devices enter the equivalent of a mobile Katrina. The packets destined for the servers fill the internal buffers and may even be lost. Once communication is restored there is a bandwidth limitation, and some packets, due to insufficient resources on the cell towers, arrive later than others. Verizon or T-Mobile or AT&T cannot bring extra cells into the area to increase the bandwidth!
- All that data is no good for real-time processing due to its staleness. If your logic correlates packets based on time, it has to change. The user has to receive a different kind of output from the system. Have you had that discussion with your customers? Have you been preparing them for that kind of event?
- The general SLAs are totally useless in those situations. Did you include those scenarios in the fine print of your agreements?
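The triage, panic-priority, and staleness points above can be combined into one sketch: route stale location packets away from the real-time path, and rank what remains so panic packets surface first. The packet types, field names, and the 15-minute cutoff are all assumptions for illustration, not a real system's design.

```python
import heapq
import time

# Hypothetical packet classes; lower value means higher priority.
PANIC, ALARM, LOCATION = 0, 1, 2
STALE_AFTER = 15 * 60  # assumed cutoff: older than 15 minutes is "historical"

def triage(packets, now=None):
    """Split a reconnect backlog into a prioritized real-time stream
    and a historical stream that skips live map updates."""
    now = now if now is not None else time.time()
    realtime, historical = [], []
    for seq, p in enumerate(packets):
        # Panic packets are never demoted, no matter how old they are.
        if p["type"] != PANIC and now - p["ts"] > STALE_AFTER:
            historical.append(p)  # archive/report path only
        else:
            # seq breaks ties so equal-priority packets keep arrival order
            heapq.heappush(realtime, (p["type"], seq, p))
    ordered = [heapq.heappop(realtime)[2] for _ in range(len(realtime))]
    return ordered, historical
```

The key design choice this illustrates: the flood is not one queue but at least two, and the panic packet must be pulled forward before anything else is touched.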
As an asset tracking company you have to detect when geographic areas go "blank" for no discernible reason, and prepare in advance for the flood. Prepare your customers too!
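One way to detect a blank area is to compare each area's recent traffic against its usual rate and flag anything that collapses toward zero. This is a deliberately simple sketch; the baseline source and the 10% floor are assumptions, and a production system would use something more robust than a fixed threshold.

```python
from collections import Counter

def detect_blank_areas(expected_per_area, packets_last_window, floor=0.1):
    """Flag areas whose traffic fell far below baseline, a hint that
    towers are down and a reconnect flood is coming.

    expected_per_area: assumed baseline packet counts per window,
    e.g. from a rolling average (hypothetical input).
    """
    seen = Counter(p["area"] for p in packets_last_window)
    return [area for area, expected in expected_per_area.items()
            if expected > 0 and seen.get(area, 0) < floor * expected]
```

Once an area trips the detector, the capacity and triage preparations above can start before the backlog arrives, instead of after.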