Telemetry using AWS
Telemetry is an automated communications process for gathering data remotely and wirelessly. Data is obtained by establishing communication with all of the IoT sensors and actuators; it is then collected, transformed, enriched, and filtered before analytics are finally performed on the structured result.
Best Practices To Follow While Using IoT for Telemetry:
- Collect parameters from all the sensors and combine the measurements into a single message, so the cloud receives the readings as time-series data.
- Take readings every 5–10 seconds.
- Track data in Universal Time (UTC).
- Use bidirectional communication between devices and the cloud.
- It is usually a better option to add extra sensors according to the requirements.
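The first three practices above can be sketched in a few lines. This is a minimal illustration, not AWS code: the sensor names, units, and message layout are assumptions chosen for the example.

```python
import json
from datetime import datetime, timezone

def build_telemetry_message(readings):
    """Combine individual sensor readings into one message.

    `readings` maps a sensor name to its measured value; the names
    and units used below are illustrative, not AWS-defined fields.
    """
    return json.dumps({
        # Track data in UTC, per the best practices above.
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "readings": readings,
    })

# One message carrying several measurements taken in the same 5-10 s cycle
msg = build_telemetry_message({"temperature_c": 21.4, "humidity_pct": 55.0})
```

A device loop would call this every 5–10 seconds and publish the resulting JSON string to its MQTT telemetry topic.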
IoT Core: It enables you to connect IoT devices such as sensors to the AWS cloud computing platform, and can process and route trillions of messages to AWS endpoints efficiently. The six main components of IoT Core are as follows:
- Identity and Access Management (IAM)
- Device gateway
- Message Broker
- Rules Engine
- Device Shadow
- The Registry
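Of these components, the Device Shadow is worth illustrating: it is a JSON document that caches a device's last reported state and the state an application desires. A minimal shadow state document looks like this (the `samplingIntervalSeconds` property is an invented example field):

```json
{
  "state": {
    "desired":  { "samplingIntervalSeconds": 5 },
    "reported": { "samplingIntervalSeconds": 10 }
  }
}
```

When `desired` and `reported` differ, AWS IoT computes a `delta` that the device can subscribe to, which is how the bidirectional communication recommended above is typically realized.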
1. CREATE A THING IN AWS:
Log in to AWS and go to AWS IoT Core. Configure a device: choose the platform, choose a device SDK, register a thing, and download the connection kit. Then configure and test the device.
2. SECURE THE SENSOR:
Go to IoT Core → Secure and edit the policy so the thing can subscribe only to the topics it requires. Edit the policy document and paste it into the policy you created, defining the set of policy actions (Connect, Subscribe, and GetThingShadow).
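A policy document granting exactly the three actions named above might look like the following sketch. The region, account ID, and client name are placeholders, and the `*` resources are kept loose for illustration; a production policy should scope them to specific topic and thing ARNs.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iot:Connect",
      "Resource": "arn:aws:iot:us-east-1:123456789012:client/my-sensor"
    },
    {
      "Effect": "Allow",
      "Action": ["iot:Subscribe", "iot:GetThingShadow"],
      "Resource": "*"
    }
  ]
}
```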
3. TRANSFORM AND VISUALIZE DATA BY USING THE AWS IoT RULES ENGINE:
Add a rule that directs the MQTT topic to AWS IoT Analytics; this establishes the pathway from IoT Core to IoT Analytics. Create the rule, query the message attributes, and add the action. Then test by resubscribing to the topic to verify the flow of data and see how it is transformed.
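The "query attributes" step uses the rules engine's SQL dialect. A rule statement selecting a few fields from a telemetry topic could look like this sketch; the topic filter and attribute names are assumptions, while `timestamp()` is a built-in AWS IoT SQL function:

```sql
SELECT temperature, humidity, timestamp() AS received_at
FROM 'telemetry/+/data'
```

The rule's action would then send every matching message to an IoT Analytics channel.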
4. IoT Analytics:
It is a tool for transforming and analyzing data by running queries with a built-in SQL query engine.
- Collect: Select specific data (raw and unprocessed) before publishing it to the pipeline.
- Pipeline: Process and filter the data before storing it.
- Data Store: A queryable repository, not a database.
- Dataset: A SQL data set produced by performing SQL actions on the data store.
- Analyze: Query the data with the SQL query engine.
- Build: Build visualizations and dashboards to gain insights using Amazon QuickSight.
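As a sketch of the Dataset step, an IoT Analytics dataset is defined by a SQL query over the data store. The data store name and attribute below are assumptions; `__dt` is the partition column IoT Analytics adds to stored messages:

```sql
SELECT avg(temperature) AS avg_temperature
FROM my_datastore
WHERE __dt >= current_date - interval '1' day
```

Scheduling this dataset to refresh periodically yields the time-series summaries that QuickSight dashboards are then built on.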