
normalizeUplink vs the mandatory schema validation


Hello,

I am trying to understand why validation of the normalizeUplink() output against the Normalized Payload Schema is mandatory.

I really like the idea of decoupling decodeUplink() and normalizeUplink(). It allows using a vendor-provided decoder as-is while writing my own normalizer to restructure the output the way I want it.
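To illustrate (a minimal sketch; the vendor decoder and its field names are made up for the example), the decoder stays untouched while a custom normalizeUplink() reshapes its output:

```javascript
// Vendor-provided decoder, used as-is (payload layout is hypothetical).
function decodeUplink(input) {
  return {
    data: {
      temp_c: ((input.bytes[0] << 8) | input.bytes[1]) / 100,
      hum_pct: input.bytes[2],
    },
  };
}

// My own normalizer: restructures the decoded output without touching
// the decoder. `input.data` is whatever decodeUplink() returned.
function normalizeUplink(input) {
  return {
    data: {
      air: {
        temperature: input.data.temp_c,       // °C
        relativeHumidity: input.data.hum_pct, // %
      },
    },
  };
}
```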

I am feeding an Elastic cluster through webhooks, and the output of the decoder is not how I want the data structured. Without touching the decoder, I have two choices: either send my data through some intermediate ingest pipeline to reformat it, or use the normalizeUplink() function and keep everything inside TTN.

My problem is that making the validation of the normalizeUplink() output against the schema mandatory leads to two issues:

  • I am obliged to follow that schema, even though it may be of no value to me, since this data is being fed into my own infrastructure and I may not necessarily want or need to follow it.
  • The schema does not support some of the fields/metrics of the device I am using (see the sketch after this list). This has just happened to me with the SenseCAP S2120. That problem has also been discussed in the SenseCAP S2120 8-in-1 payload topic.
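For instance (a sketch only; I am assuming here, purely for illustration, that the UV index is one of the S2120 metrics the schema has no field for), there is simply nowhere valid to put such a reading:

```javascript
function normalizeUplink(input) {
  return {
    data: {
      air: {
        temperature: input.data.temperature, // covered by the schema
      },
      // Hypothetical unsupported metric: no matching field exists in the
      // Normalized Payload Schema, so the mandatory validation rejects it,
      // even though my own Elastic mapping would accept it just fine.
      uvIndex: input.data.uv_index,
    },
  };
}
```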

Wouldn’t it make sense to either allow disabling the schema validation of the normalizeUplink() output, or allow providing an alternate schema for validation?

If this makes sense, I’ll be happy to file an issue about it. LMKWYT.

Thanks,
Colin

