You're in the running for the job of your dreams. The final choice comes down to you and one other candidate. In the end, you are turned down.
You beat yourself up about it and try to move on. But unbeknownst to you, the employer passed you over because of embarrassing material from your childhood, surfaced thanks to your parents' frequent sharing about you on social media.
This is a likely scenario for today's children, who are growing up in an age when many parents document their childhoods on Facebook, Instagram, Twitter, YouTube, and other social media. Many young people may be startled to discover, one day, that the online record of their lives ranges from sonograms taken while they were in their mothers' wombs to videos of their tantrums, posts about their illnesses, and accounts of their likes and dislikes.
These children will grow up one day to find out that, somewhere in the vast amount of online data, there is enough information to create an accurate profile of them.
Sharenting, a portmanteau of (social media) "sharing" and "parenting", can have effects far more immediate than a lost job opportunity, Leah Plunkett, a law professor at the University of New Hampshire, argues in her recently released book Sharenthood: Why We Should Think Before We Talk About Our Kids Online.
The kind of information that can be gleaned from sharenting ranges from the silly and quirky to the sensitive, such as dates of birth and places of residence that can be used for identity theft, and even images that can be misappropriated for illegal purposes such as child pornography.
One doesn't have to be a braggart on social media to engage in sharenting; the practice creeps up on even the most discreet of parents. Opting into a fertility app, using a baby camera, or even storing baby photos in the cloud are all forms of sharenting because, even in the most benign cases, they turn children into data points that can be fed into a wide range of algorithms without their consent.
The issue with sharenting, Plunkett, who is a Harvard College and Law School alumna and also serves as a faculty associate at the institution's Berkman Klein Center for Internet and Society, explains in an email to Interesting Engineering, is that the practice is insufficiently regulated.
"In the United States, there is no comprehensive federal law on youth digital privacy. There are federal laws about student privacy (the Family Educational Rights and Privacy Act, the Children's Online Privacy Protection Act, and the Protection of Pupil Rights Amendment), but these laws only apply to educational settings, not the many other settings where children and adolescents spend their time, like their homes," she writes.
In the absence of a legal framework, parents are left to serve as the primary gatekeepers for their children's digital lives, she adds. "This role is difficult for parents to fulfill meaningfully because the privacy policies, terms of service, and other tech provider terms that we all agree to are hard, if not impossible, to understand."
Things may seem a little rosier across the Atlantic, where the European Union's General Data Protection Regulation (GDPR) affords children some protection in the information society. Under Article 8, providers of online services must obtain and verify parental consent before processing the personal data of children under the age of 16 (member states may lower this threshold to as low as 13). And Article 17 of the GDPR introduces the so-called "right to erasure", or "right to be forgotten", which empowers individuals to request that their online information be deleted. Where the data concerns children, online service providers are particularly urged to erase it under this article.
But such legal stipulations fall short of fully protecting children against sharenting, Plunkett argues.
"The law doesn't protect kids against the off-line dangers that could befall them by having their private information digitally exposed, such as abuse by a predatory adult who learns details about their lives from parents' social media and uses that information to target them," she writes.
Often, the subjects of sharenting are too young to grant consent or to understand what is happening. Moreover, growing up in the age of the Internet means that the private spaces in which children can play and learn are shrinking.
As Plunkett explains, "even if children stay safe from potential predators and other [real-life] dangers, their sense of self is likely to be eroded if they lack private spaces to play. If parents are sharing everything they want, they are depriving their kids of private, protected spaces to play, to make mischief, even mistakes, and to grow up better for having made them. If kids are made to feel self-conscious in their own homes and other intimate spaces, it becomes difficult for them to chart their own course of identity formation."
Once children grow old enough to form their own opinions, they often object to their parents' online antics. A 2019 Microsoft survey of teenagers aged 13 to 17 in 25 countries found that 42% take issue with what their parents post about them on social media.
Jacqueline Beauchere, Microsoft's chief online safety officer, believes that the information gleaned from social media posts "can be misused in online social engineering schemes, culled together to make children and other young people the targets of online fraud or identity theft, or in extreme cases may even lead to online grooming."
If someone took out a line of credit in a child's name, she contends, the child would likely not find out until they grew up and applied for their own credit cards or loans. Meanwhile, online grooming is a favored recruitment tactic for terrorist and extremist organizations, sex traffickers, and other criminal enterprises.
Of particular concern, Plunkett argues, are cases of commercial sharenting. Getting paid for putting one's children on the internet (say, by shooting vlogs about them and uploading them to YouTube, as some parents do) raises concerns not only about data protection, but also about the ethics of using children for income generation. Child labor, in the traditional sense, is prohibited by law. But what constitutes child labor in the age of social media is poorly understood, defined, and monitored.
At the opposite end of the spectrum, Plunkett contends, are cases in which parents of children with illnesses or disabilities use social media to raise awareness of those conditions. In such cases, the benefits may outweigh the drawbacks of the lost privacy.
If anecdotal evidence is anything to go by, more and more parents are becoming aware of the downsides of social media activity and more intentional about what and how they share. This trend has, in part, been driven by the repeated data privacy scandals involving Facebook in recent years, which have prompted a growing number of users to rethink what they share on social media platforms.
For instance, some celebrities have taken to blurring their children's faces in social media posts and several of this author's friends have chosen to skip online sharing altogether in order to avoid giving their children an online identity before they are old enough to consent to having one.
Evidently, one can't avoid online exposure altogether in this day and age. For parents who want to protect their children online while keeping up with the times, Plunkett recommends "lower touch" approaches.
"For example, storing photos and videos on a cloud-based service through an individual account is 'lower touch' than putting those same images on social media. Yes, use of the cloud-storage service still relies on commercial digital tech to handle private data (images, in this case), but it doesn't broadcast the data to other people the way social media does. While there are still risks to privacy any time digital tech is used, using it in a way that allows for fewer sets of eyes on the data provides some privacy protection bump," she concludes.