Neutron is a distributed air quality testing platform. It was originally born out of a desire to monitor the air quality in our own home; the project then expanded into offering IAQ testing for homes as a service.
I am working on building the platform as a part of 100 Days of Code.
Neutron is the central server running InfluxDB, CouchDB and some helper scripts.
A Neutrino is a sensor node that runs a custom Python application to pull data from its onboard environmental sensors and report that data back to the Neutron on the network.
- Base Enclosure - PLA 1.75mm @ 50.15m
- Sensor Insert - PLA 1.75mm @ 32.38m
- Lid - PLA 1.75mm @ est. 20m (being redesigned)
- Sensor Carriers - PLA 1.75mm @ 6.57m
100 Days of Neutron
- Just Getting Started
- Building a REST API
- Persistent Storage, Error Handling
- 3D Printing, Day 1
- 3D Printing, Day 2
- 3D Printing, Day 3
- 100 Days, of a passion project
- Fuck 3D Printing
- I need to get back to software
- Hexagon me up
- Almost there
- Sensor Insert and Carriers
- Correcting bad printing mistakes
- Thinking about the company
A Neutrino goes through the following start-up process:
- Connect to Neutron (CouchDB) and pull Neutrino configuration details
- If customer != "default", test authorization to NeutronDB (InfluxDB)
- After successful authorization to NeutronDB, begin polling sensor readings
- Sensor readings every 10 seconds => Publish to NeutronDB
- Every 5 minutes, check for changes to customer assignment
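The start-up sequence above can be sketched roughly as follows. All function names here (fetch_config, test_auth, read_sensors, publish) are hypothetical stand-ins, not the real Neutrino application's API:

```python
# Sketch of the Neutrino start-up sequence, assuming hypothetical
# helpers for the CouchDB/InfluxDB calls described in the text.
import time

POLL_INTERVAL = 10        # seconds between sensor readings
REASSIGN_CHECK = 300      # seconds between customer-assignment checks

def needs_auth_test(config: dict) -> bool:
    """Only non-default customers require an auth check against NeutronDB."""
    return config.get("customer", "default") != "default"

def run(neutrino_id: str, fetch_config, test_auth, read_sensors, publish):
    config = fetch_config(neutrino_id)          # pull config from Neutron (CouchDB)
    if needs_auth_test(config):
        if not test_auth(config):               # verify token against NeutronDB (InfluxDB)
            raise RuntimeError("authorization to NeutronDB failed")
    last_check = time.monotonic()
    while True:
        publish(read_sensors())                 # reading every 10 s -> NeutronDB
        if time.monotonic() - last_check >= REASSIGN_CHECK:
            config = fetch_config(neutrino_id)  # re-check customer assignment
            last_check = time.monotonic()
        time.sleep(POLL_INTERVAL)
```

The only logic taken directly from the document is the "default customer skips the auth test" branch and the two intervals.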
When a Neutrino is assigned to a customer via a POST to /neutrino/<neutrino_id>, a few helper functions add the neutrino_id to the customer entry and then update the bucket authorizations so that the Neutrino's token is allowed to write.
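The shape of those two helper updates might look something like this. The document schemas (a "neutrinos" list on the customer entry, a "write_tokens" list on the bucket authorization) are assumptions for illustration; the real CouchDB documents and InfluxDB authorization objects are not shown in this document:

```python
# Illustrative sketch of the two updates triggered by the
# POST to /neutrino/<neutrino_id>. Database calls are replaced
# with plain dict updates; field names are assumed, not real.

def assign_neutrino(customer_doc: dict, neutrino_id: str) -> dict:
    """Record the Neutrino on the customer's entry."""
    neutrinos = set(customer_doc.get("neutrinos", []))
    neutrinos.add(neutrino_id)
    customer_doc["neutrinos"] = sorted(neutrinos)
    return customer_doc

def authorize_token(bucket_auth: dict, token: str) -> dict:
    """Grant the Neutrino's token write access to the customer bucket."""
    writers = set(bucket_auth.get("write_tokens", []))
    writers.add(token)
    bucket_auth["write_tokens"] = sorted(writers)
    return bucket_auth
```

In the real platform these would be persisted back to CouchDB and to InfluxDB's authorization store rather than mutated in memory.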
It appears that 15-second-resolution readings use about 0.5MB of storage per day per data point. One house's worth of units should come to (12) Neutrinos, each with about 20 data points, which works out to 120MB of readings per day per house, or 3.6GB per month per house. I will eventually need to look into better ways of storing the data.
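The arithmetic behind those figures, using the numbers quoted above:

```python
# Back-of-the-envelope check of the per-house storage estimate.
MB_PER_POINT_PER_DAY = 0.5   # observed at 15-second resolution
NEUTRINOS_PER_HOUSE = 12
POINTS_PER_NEUTRINO = 20

daily_mb = MB_PER_POINT_PER_DAY * NEUTRINOS_PER_HOUSE * POINTS_PER_NEUTRINO
monthly_gb = daily_mb * 30 / 1000    # assuming a 30-day month

print(daily_mb)    # 120.0 MB/day per house
print(monthly_gb)  # 3.6 GB/month per house
```

Downsampling older data (e.g. an InfluxDB retention policy plus aggregation task) is the usual way to keep this growth in check.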