Jeremy Blackburn

Assistant Professor of Computer Science at Binghamton University, State University of New York
Bio

Jeremy Blackburn joined the School of Computing at Binghamton University in fall 2019. He is broadly interested in data science, with a focus on large-scale measurements and modeling. His largest line of work is in understanding jerks on the Internet. His research on toxic behavior, hate speech, and fringe and extremist Web communities has been covered by The Washington Post, The New York Times, The Atlantic, The Wall Street Journal, the BBC, and New Scientist, among others.

Recent Quotes
  • Expert Weighs In on X Outage: Cyberattack or Routine Glitch?
    Jeremy comments on the X outage, stating, "Elon Musk is not exactly what I would call a reliable narrator, but cyberattacks happen." He suggests it could also be a routine service failure, noting that such disruptions can happen in the ordinary course of running a large platform.
  • Google has refused to add fact-checks to YouTube videos and search results despite a new European law. This news comes on the heels of Meta putting an end to fact-checking. Jeremy Blackburn, a computer scientist at Binghamton University, State University of New York, who researches toxic behavior online, believes that community moderation can be a solution, but the situation is complicated.

    "Zuckerberg directly mentioned that they are going to move to a program along the lines of what Twitter has (Community Notes)," said Blackburn. "Community Notes, and Birdwatch before it, have been relatively effective in my opinion. One of the more difficult problems with moderation is scaling, and systems that leverage an existing userbase are a pretty straightforward solution. In fact, they are likely better solutions than hard moderation policies (e.g., banning users or content) which my and others' work has shown often leads to the users creating their own alternative platforms and in some cases exhibiting worse behavior.

    At the same time, Blackburn said, leaving it entirely up to the community to self-regulate is definitely going to have consequences.

    "While harnessing the wisdom of the crowd is efficient, it isn’t always accurate. There is also the risk that the mechanism itself is exploited. There is already some evidence of this on Twitter, with Community Notes being used for things past fact checking, including personal attacks and a mechanism to spread hate speech and conspiracy theories. The bottom line is that any moderation system will have holes, and Community Notes type systems give potential "attackers" direct access to moderation tools.

    He also thinks it’s impossible to discuss this situation without viewing it in the broader context of the world’s current socio-political concerns.

    "Tight (perhaps overly so) moderation practices have been the norm for several years now and have played major roles in world events. For example, consider Trump being banned from Twitter due to his behavior during the Jan. 6 insurrection. Considering the current political climate, not just in the U.S. but worldwide, it is likely in social platforms’ best interest to enforce policies that are in congruence with the incoming administration, regardless of any other factors. I think it could be argued that this is actually an existential concern for large social media platforms."

  • Shutting down social media platforms somewhat effective in curbing hate speech, but not a long-term solution

    While deplatforming (shutting down social media platforms) can be effective in reducing users and content produced, it’s not a long-term solution for what is a very complex issue, according to Jeremy Blackburn, assistant professor of computer science at Binghamton University, State University of New York.

    Platform banning can reduce the growth of new users over time, and less content is produced overall, said Blackburn. On the other hand, migrations do happen, often in response to real-world events – for example, a deplatformed personality who migrates to a new platform can trigger an influx of new users.

    “Ultimately, it’s unlikely that deplatforming, while certainly easy to implement and effective to some extent, will be a long-term solution in and of itself,” said Blackburn. “Moving forward, effective approaches will need to take into account the complicated technological and social consequences of addressing the root problem of extremist and violent Web communities.”

  • 120 million Parler posts reveal users shared content related to Donald Trump’s efforts to challenge election

    In recent news, archivists have saved content posted by users of the social media platform Parler, which was booted by big tech companies like Apple and Amazon. Those posts largely revolve around support for Donald Trump and his efforts during the 2020 election, according to new research co-authored by Jeremy Blackburn, assistant professor of computer science at Binghamton University, State University of New York.

    Blackburn and fellow researchers performed the first-ever data-driven characterization of Parler, analyzing posts and metadata from users who joined the platform between 2018 and 2020. The researchers analyzed 120 million posts, revealing what users most often post about.

    “We found that Parler users share content related to US politics, content that shows support for Donald Trump and his efforts during the 2020 US elections, and content related to conspiracy theories,” said Blackburn. “Parler attracts the interest of conservatives, Trump supporters, and religious and patriot individuals.”
