LoRaWAN's transmission rate is supposed to be between 250 and 5470 bps.
So, assuming I only have 30 seconds of on-air time every 24 hours, a node can send between 937 and 20,512 bytes per day, meaning it could be as low as about 39 bytes per hour. For a sensor reporting a single 1-byte value, that gives me a guaranteed remote-sensor resolution of roughly one reading every two minutes.
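Here is that arithmetic as a quick Python sketch (the 30 s/day figure is The Things Network's fair-access allowance; 250 and 5470 bps are the EU868 DR0 and DR5 rates):

```python
SECONDS_PER_DAY_ON_AIR = 30          # TTN fair-access policy: 30 s uplink airtime per 24 h
MIN_BPS, MAX_BPS = 250, 5470         # EU868 DR0 (SF12/125 kHz) .. DR5 (SF7/125 kHz)

min_bytes_per_day = SECONDS_PER_DAY_ON_AIR * MIN_BPS / 8   # 937.5 bytes
max_bytes_per_day = SECONDS_PER_DAY_ON_AIR * MAX_BPS / 8   # 20,512.5 bytes
bytes_per_hour_worst = min_bytes_per_day / 24              # ~39 bytes/hour

# One 1-byte reading per transmission, ignoring all protocol overhead:
minutes_between_readings = 60 / bytes_per_hour_worst       # ~1.5 min, so call it 2
print(min_bytes_per_day, max_bytes_per_day, bytes_per_hour_worst, minutes_between_readings)
```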
Is this calculation correct?
How much of this 30-second on-air time is used up by protocol headers, or is that handled outside of the fair-use limit calculation?
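For context, here is my rough attempt at quantifying that overhead, using the LoRa time-on-air formula from the Semtech SX127x datasheet and assuming the usual 13 bytes of LoRaWAN uplink overhead (MHDR + FHDR + FPort + MIC) on top of my 1-byte payload. The parameter choices (SF12, 125 kHz, CR 4/5, 8-symbol preamble) are my assumptions, not anything the fair-use policy specifies:

```python
import math

def lora_time_on_air(payload_len, sf=12, bw=125_000, cr=1, preamble=8,
                     explicit_header=True, crc=True):
    """Time on air (seconds) for one LoRa packet, per the SX127x datasheet formula."""
    de = 1 if (bw == 125_000 and sf >= 11) else 0   # low-data-rate optimisation
    t_sym = (2 ** sf) / bw
    ih = 0 if explicit_header else 1
    payload_symbols = 8 + max(
        math.ceil((8 * payload_len - 4 * sf + 28 + 16 * crc - 20 * ih)
                  / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble + 4.25 + payload_symbols) * t_sym

MAC_OVERHEAD = 13                 # MHDR(1) + DevAddr(4) + FCtrl(1) + FCnt(2) + FPort(1) + MIC(4)
toa = lora_time_on_air(1 + MAC_OVERHEAD)   # ~1.16 s per 1-byte uplink at SF12
uplinks_per_day = 30 / toa                 # ~26, i.e. roughly one reading per hour,
print(toa, uplinks_per_day)                # far worse than the headerless estimate above
```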
I had thought that LoRaWAN enabled negotiation with devices to set their transmission behaviour. Surely, if the gateway is only seeing a small number of devices, it can tell them to transmit more frequently, since there is much more bandwidth available; then, as device density in an area increases, the on-air schedule is managed and reduced across all of them, while always leaving a large percentage of air time unallocated. A sketch of what I mean is below, but I am guessing that is not how it works?
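To make that concrete, the scheme I was imagining is just proportional allocation, something like this hypothetical sketch (none of these names or mechanisms come from the LoRaWAN spec):

```python
def airtime_allowance_per_device(total_airtime_s, n_devices, reserve_fraction=0.3):
    """Hypothetical gateway-side allocation: split the usable airtime evenly
    across devices while holding back a fixed reserve of unallocated time."""
    usable = total_airtime_s * (1 - reserve_fraction)
    return usable / max(n_devices, 1)

# e.g. 10 devices sharing one channel-day of airtime, 30% held in reserve:
print(airtime_allowance_per_device(86_400, 10))   # 6048 s of airtime each per day
```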