Exclusive A Brit biz whose mobile apps monitor the mental state of 35,000 British schoolchildren is having to rewrite them after a researcher found hardcoded login credentials within.
"Tracking steering biases is a pioneering technique developed by STEER using AI to identify patterns of bias linked to mental health risks in 10,000 test students," burbles the company's website.
Steer, a trading name of Mind.World Ltd, claims to have 150 subscribing schools. Included within the customer list on its website are British public schools such as Charterhouse, Fettes College, Oundle School and Wellington College.
Children enrolled in one of Steer's apps are labelled as red, amber or green depending on how it grades their mental health in response to questionnaires they fill out.
Its flagship tech, deployed under the brands AS Tracking and CAS Tracking, is sold to schools as a way of monitoring their pupils' wellbeing and allowing teachers to stage interventions.
"Since we began running AS Tracking we got a 20 per cent decrease in self harm at the college," a teacher told Sky News on Monday, for a video feature extolling the firm's virtues.
The company offers to let schools "track, signpost and support every pupil" through its apps, database functionality and analytics, boasting that by licensing each child for £18.75, schools can "benchmark" themselves "nationally".
Yet versions of the apps developed for pupils and teachers alike contain hardcoded login credentials – posing a security risk to the highly sensitive mental health data of some of the most vulnerable children.
Keys to the kids' castle
Privacy advocate and internet troublemaker Gareth Llewellyn discovered the login credentials. Embedded within the Android version of AS Tracking were what appeared to be username and password pairs:
public static String AUTH_PASSWORD = "y2-@qtYg>*xFMQ)g";
public static String AUTH_USERNAME = "usteer";
public static final String API_BASIC_AUTH_PASSWORD = "Shi$i7eth7ae";
public static final String API_BASIC_AUTH_USERNAME = "Testing";
Once The Register persuaded Steer to take our concerns seriously, the company responded by promising to rewrite its apps to remove the hardcoded credentials and improve its general security. It said that the testing account was not used in practice and claimed that the "usteer" credential pair was a "legacy" login which has since been disabled.
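For the curious, the safer pattern the rewrite implies looks roughly like the sketch below: secrets are supplied to the app at run or deploy time rather than compiled into the binary, where anyone with a decompiler can read them. This is a generic, hypothetical Java illustration – the class and variable names are ours, not Steer's, and a production Android app would more likely inject secrets at build time from an untracked config file or, better still, fetch short-lived tokens from a backend.

```java
// Hypothetical sketch, not Steer's code: load API credentials from the
// runtime environment instead of hardcoding them as string constants.
public class ApiCredentials {
    private final String username;
    private final String password;

    private ApiCredentials(String username, String password) {
        this.username = username;
        this.password = password;
    }

    // Read credentials injected at deploy time. Fail fast if they are
    // absent rather than falling back to a baked-in default – a silent
    // fallback is how "legacy" logins end up shipping in the APK.
    public static ApiCredentials fromEnvironment() {
        String user = System.getenv("API_BASIC_AUTH_USERNAME");
        String pass = System.getenv("API_BASIC_AUTH_PASSWORD");
        if (user == null || pass == null) {
            throw new IllegalStateException("API credentials not configured");
        }
        return new ApiCredentials(user, pass);
    }

    public String username() { return username; }
}
```

The point of the fail-fast check is that a misconfigured deployment breaks loudly at startup instead of quietly authenticating with a test account.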
The firm told us:
Data privacy and security are Steer's absolute priority so as soon as we were made aware of this potential issue, we started an investigation together with our third-party developers.
It's important to state that all data stored on our servers is encrypted, and not attributable to individuals: access IDs are encrypted, passwords are hashed, and this information is separated from the encrypted assessment data, which requires a separately stored algorithm (and other information) to interpret. Accordingly, while our investigation is ongoing, we do not believe that any sensitive data was accessible, or exposed.
Steer concluded: "We have removed the credential information flagged by The Register and are applying additional security measures as a precaution."
The good, the bad and the ugly
Duncan Brown, chief security strategist at web filtering biz Forcepoint, took a nuanced view of what happened here. He told The Reg: "The app developers are aiming to mitigate against the negative impact mental health issues can cause, and we should not shy away from using technology to assist people in understanding behavioural patterns.
"However, the security needs to be as robust as the science, particularly when dealing with such sensitive information held on minors. We should not avoid initiatives such as these simply to avoid potential privacy issues but privacy and security must be hardwired in: it's extraordinary that in 2019 developers are still hardcoding passwords."
Getting in a sneaky mention of his firm's current product lineup, which depends on analytics and analysis, Brown opined: "As usage of behavioural analytics grows, we as an industry need to improve governance, clearly lay out the purpose of any data collection – and of course ensure that any personal data collected remains completely secure."
Hardcoded credentials have caught out many a developer and company before. A smart home company called Zipato hardcoded the same private SSH key into all of its hubs, forcing the company to update all the devices and scrap the SSH tunnel. Similarly, Juniper Networks hardcoded login credentials into some of its data centre switches, a blunder discovered earlier this year.
Never one to be outdone in security snafus, Cisco admitted last year that it had not only left an undocumented root account in its video surveillance management software product, but that account also had hardcoded credentials.
The prevalence of these cockups doesn't clear these companies of wrongdoing. In the 21st century nothing should be deployed publicly with hardcoded creds. ®