Millions of Americans have one thing easily in reach – their smartphone. The device they pick up has a good chance of being an Apple product. Today, the company announced a new initiative aimed at improving safety for children, but that good goal may come with unintended consequences for foster parents and other mandatory reporters.
Haven’t read up on it yet? Take a moment to educate yourself, then check back for some thoughts.
Foster parents are unique among the group of individuals known as mandatory reporters in that they already live in glass houses, with many aspects of their lives open to inspection. Professionals including teachers, social workers, doctors, childcare providers and foster parents are required by law in many states to report suspicions of abuse, neglect or mistreatment of children.
Making such allegations is a serious matter and must be backed up with documentation. While written journals and agency forms are often used, the best documentation is a photo or video. When suspicious marks or injuries are spotted, mandatory reporters often grab the high-definition camera closest to them: the one on their phone or tablet.
In our personal experience, we’ve used photos to document suspected injuries on children returning from visits with parents. In some cases those digital files have helped substantiate an existing concern, leading to changes in visitation. In one case, a child left a treatment facility with more than 20 bruises on their body. Quick iPhone snaps allowed us to document the bruises inside the room where the child changed out of hospital clothing, thereby showing that the fresh wounds occurred in that facility.
Then, there are the normal kid bumps and bruises. Foster care and child welfare agencies routinely ask foster parents to text photos of cuts, scratches, bruises and skin irritations as soon as they are noticed so they can be included in the child’s file. This instant documentation protects foster parents and reduces agency risk. Sometimes incidents happen in a sensitive place on the child’s body. Under Apple’s new program, will a photo of extreme diaper rash cause the new protocol to kick into gear?
While the stated aim of Apple’s initiative is no doubt good – who wouldn’t want to intercept child pornography and seek justice for those kids – the parameters of the digital search have not been publicly clarified, perhaps for good reason.
The computer program used to monitor photos puts great power in the hands of both company employees and the contractors used to manage the program. That power raises the risk of abuse by those overseers, but also the risk of unintentional harm to caregivers.
Considering the ramifications of a false report for a foster family, the unknowns are enough to raise concern.
A false allegation of abuse can be devastating for a foster family. Upon receipt of such allegations, families can temporarily lose custody of their children, biological children included, while investigations are completed. With overtaxed and understaffed social services agencies, those investigations are not guaranteed to be swift. Allegations can also create problems with employers and, at worst, draw a foster parent into the legal system if the reports are severe enough.
There are bad apples in the foster care community just as in any other aspect of life, and a process to report suspected abuse is very much needed. But for those who desire to serve their communities in this manner and build families in the process, blanket, wide-sweeping surveillance programs can give pause on everything from whether to become a foster family at all to what forms of technology to use. We as a nation already face a great shortage of foster families. This could be one more barrier to entry.
Other than carrying a standalone digital camera for documentation and then downloading and emailing the photos, or handing flash drives (remember those?) to workers, there are few options but to use the smartphone for the kind of instant reporting many agencies require.
Of course, companies like Apple can do what they wish with the products they sell, and we are under no obligation to purchase them. However, broad-stroke policies such as this may give consumers pause to consider whether their iPhone is worth the risk of a false allegation should this new algorithm flag a photo captured to protect a child.
What are your thoughts? Let me know in the comments below.