Instagram Introduces Teen Accounts, Other Sweeping Changes to Boost Child Safety Online

17 September 2024 at 12:40

Instagram is introducing separate teen accounts for those under 18 as it tries to make the platform safer for children amid a growing backlash against how social media affects young people’s lives.

Beginning Tuesday in the U.S., U.K., Canada and Australia, anyone under 18 who signs up for Instagram will be placed into a teen account, and those with existing accounts will be migrated over the next 60 days. Teens in the European Union will see their accounts adjusted later this year.


Meta acknowledges that teenagers may lie about their age and says it will require them to verify their ages in more instances, such as when they try to create a new account with an adult birthdate. The Menlo Park, California-based company also said it is building technology that proactively finds teen accounts posing as adults and automatically places them into the restricted teen accounts.

Read More: The U.S. Surgeon General Fears Social Media Is Harming the ‘Well-Being of Our Children’

The teen accounts will be private by default. Private messages are restricted so teens can only receive them from people they follow or are already connected to. “Sensitive content,” such as videos of people fighting or those promoting cosmetic procedures, will be limited, Meta said. Teens will also get notifications if they are on Instagram for more than 60 minutes and a “sleep mode” will be enabled that turns off notifications and sends auto-replies to direct messages from 10 p.m. until 7 a.m.

While these settings will be turned on for all teens, 16- and 17-year-olds will be able to turn them off. Kids under 16 will need their parents’ permission to do so.

“The three concerns we’re hearing from parents are that their teens are seeing content that they don’t want to see or that they’re getting contacted by people they don’t want to be contacted by or that they’re spending too much [time] on the app,” said Naomi Gleit, head of product at Meta. “So teen accounts is really focused on addressing those three concerns.”

The announcement comes as the company faces lawsuits from dozens of U.S. states that accuse it of harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.

In the past, Meta’s efforts at addressing teen safety and mental health on its platforms have been met with criticism that the changes don’t go far enough. For instance, while kids will get a notification when they’ve spent 60 minutes on the app, they will be able to bypass it and continue scrolling.

That’s unless the child’s parents turn on “parental supervision” mode, which lets parents cap a teen’s daily time on Instagram at a specific limit, such as 15 minutes.

With the latest changes, Meta is giving parents more options to oversee their kids’ accounts. Those under 16 will need a parent or guardian’s permission to change their settings to less restrictive ones. They can do this by setting up “parental supervision” on their accounts and connecting them to a parent or guardian.

Nick Clegg, Meta’s president of global affairs, said last week that parents don’t use the parental controls the company has introduced in recent years.

Gleit said she thinks teen accounts will create a “big incentive for parents and teens to set up parental supervision.”

“Parents will be able to see, via the family center, who is messaging their teen and hopefully have a conversation with their teen,” she said. “If there is bullying or harassment happening, parents will have visibility into who their teen’s following, who’s following their teen, who their teen has messaged in the past seven days and hopefully have some of these conversations and help them navigate these really difficult situations online.”

U.S. Surgeon General Vivek Murthy said last year that tech companies put too much of the burden on parents when it comes to keeping children safe on social media.

“We’re asking parents to manage a technology that’s rapidly evolving that fundamentally changes how their kids think about themselves, how they build friendships, how they experience the world — and technology, by the way, that prior generations never had to manage,” Murthy said in May 2023.

How One Brazilian Judge Could Suspend Elon Musk’s X


SAO PAULO — It’s a showdown between the world’s richest man and a Brazilian Supreme Court justice.

The justice, Alexandre de Moraes, has threatened to suspend social media giant X nationwide if its billionaire owner Elon Musk doesn’t swiftly comply with one of his orders. Musk has responded with insults, including calling de Moraes a “tyrant” and “a dictator.”


It is the latest chapter in the monthslong feud between the two men over free speech, far-right accounts and misinformation. Many in Brazil are waiting and watching to see if either man will blink.

What is the basis for de Moraes’ threat?

Earlier this month, X removed its legal representative from Brazil on the grounds that de Moraes had threatened her with arrest. On Wednesday night at 8:07 p.m. local time (7:07 p.m. EDT), de Moraes gave the platform 24 hours to appoint a new representative, or face a shutdown until his order is met.

De Moraes’ order is based on Brazilian law requiring foreign companies to have legal representation to operate in the country, according to the Supreme Court’s press office. This ensures someone can be notified of legal decisions and is qualified to take any requisite action.

X’s refusal to appoint a legal representative would be particularly problematic ahead of Brazil’s October municipal elections, with a churn of fake news expected, said Luca Belli, coordinator of the Technology and Society Center at the Getulio Vargas Foundation, a university in Rio de Janeiro. Takedown orders are common during campaigns, and not having someone to receive legal notices would make timely compliance impossible.

“Until last week, 10 days ago, there was an office here, so this problem didn’t exist. Now there’s nothing. Look at the example of Telegram: Telegram doesn’t have an office here, it has about 50 employees in the whole world. But it has a legal representative,” Belli, who is also a professor at the university’s law school, told The Associated Press.

Does a single judge really have that much power?

Any Brazilian judge has the authority to enforce compliance with decisions. Such measures can range from lenient actions like fines to more severe penalties, such as suspension, said Carlos Affonso Souza, a lawyer and director of the Institute for Technology and Society, a Rio-based think tank.

Individual Brazilian judges shut down Meta’s WhatsApp, the nation’s most widely used messaging app, several times in 2015 and 2016 over the company’s refusal to comply with police requests for user data. In 2022, de Moraes threatened the messaging app Telegram with a nationwide shutdown, arguing it had repeatedly ignored Brazilian authorities’ requests to block profiles and provide information. He ordered Telegram to appoint a local representative; the company ultimately complied and stayed online.

Affonso Souza added that an individual judge’s ruling to shut down a platform with so many users would likely be assessed at a later date by the Supreme Court’s full bench.

How would de Moraes suspend X?

De Moraes would first notify the nation’s telecommunications regulator, Anatel, which would then instruct operators — including Musk’s own Starlink internet service provider — to suspend users’ access to X. That includes blocking the resolution of X’s domain name — the conversion of a domain name into an IP address — and blocking access to the IP addresses of X’s servers from inside Brazilian territory, according to Belli.
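As a purely illustrative aside, the “resolution” step can be sketched in a few lines of Python: the snippet below maps a hostname to the IP addresses an operator would then be told to block. It is a minimal sketch under stated assumptions — “x.com” is used only as an example hostname, and nothing here reflects Anatel’s actual procedures or tooling.

import socket

def resolve(hostname: str) -> list[str]:
    """Return the IPv4 addresses the hostname currently resolves to."""
    # getaddrinfo queries the system's configured DNS resolvers.
    results = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    # Each result ends in an (address, port) tuple; keep the unique addresses.
    return sorted({entry[4][0] for entry in results})

if __name__ == "__main__":
    for ip in resolve("x.com"):  # example hostname only
        print(ip)

An ISP-level block typically works on both layers Belli describes: making the lookup above fail (DNS blocking) and dropping traffic addressed to the returned IPs (IP blocking).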

Given that operators are aware of the widely publicized standoff, are obliged to comply with an order from de Moraes, and face no technical hurdle in doing so, X could be offline in Brazil as soon as 12 hours after operators receive the instructions, Belli said.

Since X is widely accessed via mobile phones, de Moraes is also likely to notify major app stores to stop offering X in Brazil, said Affonso Souza. Another possible — but highly controversial — step would be prohibiting access with virtual private networks (VPNs) and imposing fines on those who use them to access X, he added.

Has X been shut down in other countries?

X and its former incarnation, Twitter, are banned in several countries — mostly authoritarian regimes such as Russia, China, Iran, Myanmar, North Korea, Venezuela and Turkmenistan.

China banned X when it was still called Twitter back in 2009, along with Facebook. In Russia, authorities expanded their crackdown on dissent and free media after Russian President Vladimir Putin sent troops into Ukraine in February 2022. They have blocked multiple independent Russian-language media outlets critical of the Kremlin, and cut access to Twitter, which later became X, as well as Meta’s Facebook and Instagram.

In 2009, Twitter became an essential communications tool in Iran after the country’s government cracked down on traditional media after a disputed presidential election. Tech-savvy Iranians took to Twitter to organize protests. The government subsequently banned the platform, along with Facebook.

Other countries, such as Pakistan, Turkey and Egypt, have also temporarily suspended X before, usually to quell dissent and unrest. Twitter was banned in Egypt after the Arab Spring uprisings, which some dubbed the “Twitter revolution,” but it has since been restored.

Why is Brazil so important to X and Musk?

Brazil is a key market for X and other platforms. Some 40 million Brazilians, roughly one-fifth of the population, access X at least once per month, according to the market research group Emarketer. Musk, a self-described “free speech absolutist,” has claimed de Moraes’ actions amount to censorship and rallied support from Brazil’s political right. He has also said that he wants his platform to be a “global town square” where information flows freely. The loss of the Brazilian market — the world’s fourth-biggest democracy — would make achieving this goal more difficult.

Brazil is also a potentially huge growth market for Musk’s satellite company, Starlink, given its vast territory and spotty internet service in far-flung areas.

Late Thursday afternoon, Starlink said on X that de Moraes this week froze its finances, preventing it from doing any transactions in the country, where it has more than 250,000 customers.

“This order is based on an unfounded determination that Starlink should be responsible for the fines levied — unconstitutionally — against X. It was issued in secret and without affording Starlink any of the due process of law guaranteed by the Constitution of Brazil. We intend to address the matter legally,” Starlink said in its statement.

Musk replied to people sharing the earlier reports of the freeze, adding his own insults directed at de Moraes.

“This guy @Alexandre is an outright criminal of the worst kind, masquerading as a judge,” he wrote.

De Moraes’ defenders have said his actions have been lawful, have been supported by most of the court’s full bench and have served to protect democracy at a time when it is imperiled.

In April, de Moraes included Musk as a target in an ongoing investigation over the dissemination of fake news and opened a separate investigation into the executive for alleged obstruction.

Will X appoint a new legal representative in Brazil?

X said Thursday in a statement that it expects its service to be shut down in Brazil.

“Unlike other social media and technology platforms, we will not comply in secret with illegal orders,” it said. “To our users in Brazil and around the world, X remains committed to protecting your freedom of speech.”

It also said de Moraes’ colleagues on the Supreme Court “are either unwilling or unable to stand up to him.”

What Is Telegram and Why Was Its CEO Arrested in Paris?

27 August 2024 at 09:42

Pavel Durov, the founder and CEO of the messaging app Telegram, was arrested in Paris over the weekend over allegations that his platform is being used for illicit activity such as drug trafficking and the distribution of child sexual abuse images.

Durov, who was born in Russia, spent much of his childhood in Italy and is a citizen of France, Russia, the Caribbean island nation of St. Kitts and Nevis and the United Arab Emirates. He was taken into custody at Paris-Le Bourget Airport in France on Saturday after landing from Azerbaijan.


In a statement posted to its platform, Telegram said it abides by EU laws and its content moderation is “within industry standards and constantly improving.” Durov, the company added, “has nothing to hide and travels frequently in Europe.”

Here are some details on Telegram, the app at the center of Durov’s arrest.

What is Telegram?

Telegram is an app that allows for one-on-one conversations, group chats and large “channels” that let people broadcast messages to subscribers. Unlike rivals such as Meta’s WhatsApp, Telegram allows group chats of as many as 200,000 people, compared with WhatsApp’s maximum of 1,024. Experts have raised concerns that misinformation spreads easily in group chats of this size.

Telegram offers encryption for communications, but — contrary to a popular misconception — this feature is not on by default. Users have to switch on the option to encrypt individual chats, and it does not work for group chats. That contrasts with rivals Signal and Facebook Messenger, where chats are encrypted end-to-end by default.

Telegram says it has more than 950 million active users. It is widely used in France as a messaging tool, including by some officials in the presidential palace and in the ministry behind the investigation into Durov. But French investigators have also found the app has been used by Islamic extremists and drug traffickers.

Telegram was launched in 2013 by Durov and his brother Nikolai. According to Telegram, Pavel Durov supports the app “financially and ideologically while Nikolai’s input is technological.”

Before Telegram, Durov founded VKontakte, Russia’s largest social network. The company came under pressure amid the Russian government’s crackdown after mass pro-democracy protests rocked Moscow at the end of 2011 and in 2012. Durov said government authorities demanded that VKontakte take down the online communities of Russian opposition activists. Authorities later asked the platform to hand over the personal data of users who took part in the 2013 uprising in Ukraine, which eventually ousted a pro-Kremlin president.

Read More: How Telegram Became the Digital Battlefield in the Russia-Ukraine War

But under pressure from Russian authorities, Durov sold his stake in VKontakte in 2014 and left the country. Today, Telegram is based in Dubai, which Durov called “the best place for a neutral platform like ours to be in if we want to make sure we can defend our users’ privacy and freedom of speech” in an April interview with conservative talk show host Tucker Carlson.

Why was Durov arrested?

Durov was detained in France as part of a judicial inquiry opened last month involving 12 alleged criminal violations, according to the Paris prosecutor’s office. It said the suspected violations include complicity in selling child sexual abuse material and in drug trafficking, fraud, abetting organized crime transactions and refusing to share information or documents with investigators when required by law.

As of Tuesday morning, he had not been charged. He can be held for questioning until Wednesday evening, at which point judges must either charge him or release him.

What has been the response?

In Russia, Kremlin spokesman Dmitry Peskov declined to comment on reports of Durov’s arrest in France.

“We still don’t know what exactly Durov is being accused of,” Peskov said Monday during his daily media conference call. “We haven’t heard any official statements on that matter.”

“Let’s wait until the charges are announced — if they are announced,” Peskov said.

Russian government officials have expressed outrage at Durov’s detention, with some calling it politically motivated and proof of the West’s double standard on freedom of speech. The outcry has raised eyebrows among Kremlin critics: in 2018 Russian authorities themselves tried to block Telegram but failed, withdrawing the ban in 2020.

Elsewhere, Elon Musk, the billionaire owner of X who has called himself a “free speech absolutist,” has been speaking out in support of Durov and posted “#freePavel” following the arrest.

“It is absurd to claim that a platform or its owner are responsible for abuse of that platform,” Telegram’s post after the arrest said. “Almost a billion users globally use Telegram as a means of communication and as a source of vital information. We’re awaiting a prompt resolution of this situation. Telegram is with you all.”

Does Telegram moderate content?

Western governments have often criticized Telegram for a lack of content moderation, which experts say opens up the messaging platform for potential use in money laundering, drug trafficking and the sharing of material linked to the sexual exploitation of minors.

Compared with other messaging platforms, Telegram is “less secure (and) more lax in terms of policy and detection of illegal content,” said David Thiel, a researcher at Stanford University’s Internet Observatory who has investigated the use of online platforms for child exploitation.

In addition, Telegram “appears basically unresponsive to law enforcement,” Thiel said, adding that messaging service WhatsApp “submitted over 1.3 million CyberTipline reports in 2023 (and) Telegram submits none.”

In 2022, Germany issued fines of 5.125 million euros ($5 million) against the operators of Telegram for failing to comply with German law. The Federal Office of Justice said that Telegram FZ-LLC had not established a lawful way to report illegal content or named an entity in Germany to receive official communications.

Both are required under German laws that regulate large online platforms.

Last year, Brazil temporarily suspended Telegram over its failure to surrender data on neo-Nazi activity related to a police inquiry into school shootings in November.

___

Associated Press Writers Barbara Surk in Nice, France, and Daria Litvinova in Tallinn, Estonia contributed to this story.

What to Know About the Kids Online Safety Act and Its Chances of Passing

21 July 2024 at 13:31

The last time Congress passed a law to protect children on the internet was in 1998 — before Facebook, before the iPhone and long before today’s oldest teenagers were born. Now, a bill aiming to protect kids from the harms of social media, gaming sites and other online platforms appears to have enough bipartisan support to pass, though whether it actually will remains uncertain.


Supporters, however, hope it will come to a vote later this month.

Proponents of the Kids Online Safety Act include parents’ groups and children’s advocacy organizations as well as companies like Microsoft, X and Snap. They say the bill is a necessary first step in regulating tech companies and requiring them to protect children from dangerous online content and take responsibility for the harm their platforms can cause.

Opponents, however, fear KOSA would violate the First Amendment and harm vulnerable kids who wouldn’t be able to access information on LGBTQ issues or reproductive rights — although the bill has been revised to address many of those concerns, and major LGBTQ groups have decided to support the proposed legislation.

Here is what to know about KOSA and the likelihood of it going into effect.

What would KOSA do?

If passed, KOSA would create a “duty of care” — a legal term that requires companies to take reasonable steps to prevent harm — for online platforms that minors are likely to use.

They would have to “prevent and mitigate” harms to children, including bullying and violence, the promotion of suicide, eating disorders, substance abuse, sexual exploitation and advertisements for illegal products such as narcotics, tobacco or alcohol.

Social media platforms would also have to provide minors with options to protect their information, disable addictive product features, and opt out of personalized algorithmic recommendations. They would also be required to restrict other users from communicating with children and to limit features that “increase, sustain, or extend the use” of the platform — such as autoplay for videos or platform rewards. In general, online platforms would have to default to the safest settings possible for accounts they believe belong to minors.

“So many of the harms that young people experience online and on social media are the result of deliberate design choices that these companies make,” said Josh Golin, executive director of Fairplay, a nonprofit working to insulate children from commercialization, marketing and harms from Big Tech.

How would it be enforced?

An earlier version of the bill empowered state attorneys general to enforce KOSA’s “duty of care” provision, but that changed after LGBTQ groups and others raised concerns that officials could use the power to censor information about LGBTQ or reproductive issues. In the updated version, state attorneys general can still enforce other provisions but not the “duty of care” standard.

Broader enforcement would fall to the Federal Trade Commission, which would have oversight over what types of content are “harmful” to children.

Who supports it?

KOSA is supported by a broad range of nonprofits, tech accountability and parent groups, and organizations such as the American Academy of Pediatrics, the American Federation of Teachers, Common Sense Media, Fairplay, The Real Facebook Oversight Board and the NAACP. Some prominent tech companies, including Microsoft, X and Snap, have also signed on. Meta Platforms, which owns Facebook, Instagram and WhatsApp, has not come out firmly in support of or opposition to the bill, although it has said in the past that it supports the regulation of social media.

ParentSOS, a group of some 20 parents who have lost children to harm caused by social media, has also been campaigning for the bill’s passage. One of those parents is Julienne Anderson, whose 17-year-old daughter died in 2022 after purchasing tainted drugs through Instagram.

“We should not bear the entire responsibility of keeping our children safe online,” she said. “Every other industry has been regulated. And I’m sure you’ve heard this all the time. From toys to movies to music to cars to everything. We have regulations in place to keep our children safe. And this, this is a product that they have created and distributed and yet over all these years, since the ’90s, there hasn’t been any legislation regulating the industry.”

KOSA was introduced in 2022 by Senators Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn. It currently has 68 cosponsors in the Senate, from across the political spectrum, which would be enough to pass if it were brought to a vote.

Who opposes it?

The ACLU, the Electronic Frontier Foundation and other free speech groups are concerned it would violate the First Amendment. Even with the revisions that barred state attorneys general from enforcing its duty of care provision, EFF calls it a “dangerous and unconstitutional censorship bill that would empower state officials to target services and online content they do not like.”

Kate Ruane, director of the Free Expression Project at the nonprofit Center for Democracy and Technology, said she remains concerned that the bill’s duty of care provision can be “misused by politically motivated actors to target marginalized communities like the LGBTQ population and just politically divisive information generally” in an attempt to suppress information that someone believes is harmful to kids’ mental health.

She added that while these worries remain, there has been progress in reducing concerns.

The bigger issue, though, she added, is that platforms don’t want to get sued for showing minors content that could be “politically divisive,” so to make sure this doesn’t happen they could suppress such topics — about abortion or transgender healthcare or even the wars in Gaza or Ukraine.

Sen. Rand Paul, R-Ky., has also expressed opposition to the bill. Paul said the bill “could prevent kids from watching PGA golf or the Super Bowl on social media because of gambling and beer ads, [though] those kids could just turn on the TV and see those exact same ads.”

He added he has “tried to work with the authors to fix the bill’s many deficiencies. If the authors are not interested in compromise, Senator (Chuck) Schumer can bring the bill to the floor, as he could have done from the beginning.”

Will it pass Congress?

Golin said he is “very hopeful” that the bill will come to a vote in July.

“The reason it has not come to a vote yet is that passing legislation is really hard, particularly when you’re trying to regulate one of the, if not the most powerful industry in the world,” he said. “We are outspent.”

Golin added he thinks there’s a “really good chance” and he remains very hopeful that it will get passed.

Senate Majority Leader Chuck Schumer, D-N.Y., who has come out in support of KOSA, would have to bring it to a vote but has not yet set aside floor time to do so. Because there are objections to the legislation, it would take a week or more of procedural votes before a final vote.

He said on the floor last week that passing the bill is a “top priority” but that it had not yet moved because of the objections.

“Sadly, a few of our colleagues continue to block these bills without offering any constructive ideas for how to revise the text,” he said. “So now we must look ahead, and all options are on the table.”
