<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0">
  <channel>
    <title>Secure multiparty computation</title>
    <link>https://www.amazon.science/tag/secure-multiparty-computation</link>
    <description>Secure multiparty computation</description>
    <language>en-US</language>
    <lastBuildDate>Thu, 08 Feb 2024 20:00:41 GMT</lastBuildDate>
    <atom:link href="https://www.amazon.science/tag/secure-multiparty-computation.rss" type="application/rss+xml" rel="self" />
    <item>
      <title>Amazon&apos;s Tal Rabin wins Dijkstra Prize in Distributed Computing</title>
      <link>https://www.amazon.science/blog/amazons-tal-rabin-wins-dijkstra-prize-in-distributed-computing</link>
      <description>Prize honors Amazon senior principal scientist and Penn professor for a protocol that achieves a theoretical limit on information-theoretic secure multiparty computation.</description>
      <pubDate>Thu, 08 Feb 2024 20:00:41 GMT</pubDate>
      <guid>https://www.amazon.science/blog/amazons-tal-rabin-wins-dijkstra-prize-in-distributed-computing</guid>
    </item>
    <item>
      <title>Computing on private data</title>
      <link>https://www.amazon.science/blog/computing-on-private-data</link>
      <description>Both secure multiparty computation and differential privacy protect the privacy of data used in computation, but each has advantages in different contexts.</description>
      <pubDate>Thu, 01 Jun 2023 18:15:13 GMT</pubDate>
      <guid>https://www.amazon.science/blog/computing-on-private-data</guid>
    </item>
    <item>
      <title>Client-private secure aggregation for privacy preserving federated learning</title>
      <link>https://www.amazon.science/publications/client-private-secure-aggregation-for-privacy-preserving-federated-learning</link>
      <description>Privacy-preserving federated learning (PPFL) is a paradigm of distributed privacy-preserving machine learning training in which a set of clients, each holding siloed training data, jointly compute a shared global model under the orchestration of an aggregation server. The system has the property that no party learns any information about any client&#8217;s training data, besides what could be inferred from the global model. The core cryptographic component of a PPFL scheme is the secure aggregation protocol, a secure multi-party computation protocol in which the server securely aggregates the clients&#8217; locally trained models into an aggregated global model, which it distributes to the clients. However, in many applications the global model represents a trade secret of the consortium of clients, which they may not wish to reveal in the clear to the server. In this work, we propose a novel model of secure aggregation, called client-private secure aggregation (CPSA), in which the server computes an encrypted global model that only the clients can decrypt. We provide three explicit constructions of CPSA that exhibit varying trade-offs. We also present experimental results demonstrating the practicality of our constructions in the cross-silo setting when scaled to 250 clients.</description>
      <pubDate>Tue, 22 Nov 2022 19:55:59 GMT</pubDate>
      <guid>https://www.amazon.science/publications/client-private-secure-aggregation-for-privacy-preserving-federated-learning</guid>
    </item>
    <item>
      <title>Privacy challenges in extreme gradient boosting</title>
      <link>https://www.amazon.science/latest-news/privacy-challenges-in-extreme-gradient-boosting</link>
      <description>Scientists describe the use of privacy-preserving machine learning to address privacy challenges in XGBoost training and prediction.</description>
      <pubDate>Tue, 22 Jun 2021 12:44:23 GMT</pubDate>
      <guid>https://www.amazon.science/latest-news/privacy-challenges-in-extreme-gradient-boosting</guid>
    </item>
    <item>
      <title>Cryptographic computing can accelerate the adoption of cloud computing</title>
      <link>https://www.amazon.science/academic-engagements/cryptographic-computing-can-accelerate-the-adoption-of-cloud-computing</link>
      <description>Amazon Scholar Joan Feigenbaum talks about two cryptographic techniques that are being used to address cloud-computing privacy concerns and accelerate enterprise cloud adoption.</description>
      <pubDate>Tue, 11 Feb 2020 16:52:05 GMT</pubDate>
      <guid>https://www.amazon.science/academic-engagements/cryptographic-computing-can-accelerate-the-adoption-of-cloud-computing</guid>
    </item>
  </channel>
</rss>
