Sense360 wants to make it easier for mobile app developers to put smartphone sensor data to work, so they can enable automated user experiences.
Smartphones come with a variety of built-in sensors that can capture real-time data about the user's activities, and sensor data is quickly becoming a discipline in its own right. Addressing this is Sense360, a startup that aims to make it easier for mobile apps to use this data and enable automated user experiences.
Sense360 collects data from a smartphone's sensors, including the ambient light sensor, accelerometer, gyroscope, and location, and processes it on behalf of its clients according to each app's specific needs. In the lead-up to its recent launch, the startup collaborated with a number of companies, including Walla.by, ChangeCollective and Happinin, to experiment with potential services.
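For a sense of what the raw input looks like, the sketch below uses Android's standard android.hardware sensor API (not Sense360's SDK, which is not shown in this article) to stream accelerometer readings. This is the kind of low-level data the platform ingests before its algorithms turn it into higher-level context.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Streams raw accelerometer readings using the stock Android sensor API.
class AccelerometerReader(private val sensorManager: SensorManager) : SensorEventListener {

    fun start() {
        val accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
        // SENSOR_DELAY_NORMAL trades sampling rate for battery life.
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_NORMAL)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val x = event.values[0]
        val y = event.values[1]
        val z = event.values[2]
        // Raw readings arrive many times per second; a single magnitude
        // value says little on its own. Inferring what the user is actually
        // doing from streams like this is the work Sense360 abstracts away.
        val magnitude = sqrt(x * x + y * y + z * z)
        println("acceleration magnitude: $magnitude m/s^2")
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {
        // No-op: accuracy changes are not relevant to this illustration.
    }
}
```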
To use the service, clients add a few lines of code to their backend, which gives Sense360 access to their raw sensor data. The platform then applies algorithms to sort through the information and isolate the desired sensor data, helping app developers provide new dimensions of smart customer service. Most promisingly, it is optimized to run in the background without draining battery power, and it operates as a blind middleman, processing the data without any awareness of who each user is.
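The article does not show Sense360's actual SDK, so the following is a hypothetical stub illustrating the integration pattern it describes: a short initialization call, with high-level detected events delivered to a callback rather than raw sensor streams. Every name here (Sense360Stub, start, onEvent, the event strings) is invented for illustration.

```kotlin
// Hypothetical stand-in for the "few lines of code" integration the
// article describes. None of these names come from Sense360's real SDK.
object Sense360Stub {

    fun start(apiKey: String) {
        // A real SDK would begin low-power background collection of
        // accelerometer, gyroscope, ambient-light, and location data here.
        println("background sensor collection started (key: $apiKey)")
    }

    fun onEvent(handler: (String) -> Unit) {
        // A real SDK would invoke the handler when server-side algorithms
        // detect a high-level event; this stub fires one fake event.
        handler("arrived_at_restaurant")
    }
}

fun main() {
    // The whole integration is a start call plus an event callback.
    Sense360Stub.start(apiKey = "demo-key")
    Sense360Stub.onEvent { event ->
        println("detected: $event")  // the app reacts to context here
    }
}
```

The appeal of this design is that the app never touches raw sensor streams or user identities; it only receives the processed events it asked for.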
Sense360 is another example of platforms seeking to make technologies interact more intuitively with each other, enhancing user experiences by requiring less action from users. The startup offers accounts starting at USD 99 per month, which allows up to 1 million event detections. What are some apps that could benefit from this tool?