Papers

  • Sample-Conditioned Hypothesis Stability Sharpens Information-Theoretic Generalization Bounds
    Yongyi Mao, Ziqiao Wang. Advances in Neural Information Processing Systems 36, pp. 49513–49541. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-Efficient Agnostic Boosting
    Udaya Ghai, Karan Singh. Advances in Neural Information Processing Systems 37, pp. 100571–100603. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-Efficient and Safe Deep Reinforcement Learning via Reset Deep Ensemble Agents
    Woojun Kim, Jongeui Park, Yongjae Shin, Youngchul Sung. Advances in Neural Information Processing Systems 36, pp. 53239–53260. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-efficient Bayesian Optimisation Using Known Invariances
    Ilija Bogunovic, Theodore Brown, Alexandru Cioba. Advances in Neural Information Processing Systems 37, pp. 47931–47965. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-Efficient Constrained Reinforcement Learning with General Parameterization
    Vaneet Aggarwal, Washim Mondal. Advances in Neural Information Processing Systems 37, pp. 68380–68405. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-Efficient Geometry Reconstruction from Euclidean Distances using Non-Convex Optimization
    Ipsita Ghosh, Christian Kümmerle, Abiy Tasissa. Advances in Neural Information Processing Systems 37, pp. 77226–77268. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-Efficient Learning of Correlated Equilibria in Extensive-Form Games
    Yu Bai, Song Mei, Ziang Song. Advances in Neural Information Processing Systems 35, pp. 4099–4110. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-efficient Multi-objective Molecular Optimization with GFlowNets
    Tingjun Hou, Kim Hsieh, Chaowen Hu, Jian Wu, Jialu Wu, Jiahuan Yan, Yiheng Zhu. Advances in Neural Information Processing Systems 36, pp. 79667–79684. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-Efficient Private Learning of Mixtures of Gaussians
    Hassan Ashtiani, Mahbod Majid, Shyam Narayanan. Advances in Neural Information Processing Systems 37, pp. 98994–99039. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-Efficient Reinforcement Learning of Partially Observable Markov Games
    Chi Jin, Qinghua Liu, Csaba Szepesvari. Advances in Neural Information Processing Systems 35, pp. 18296–18308. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sample-Then-Optimize Batch Neural Thompson Sampling
    Zhongxiang Dai, Patrick Jaillet, Bryan Kian Hsiang Low, Yao Shu. Advances in Neural Information Processing Systems 35, pp. 23331–23344. Neural Information Processing Systems Foundation, Inc. (NeurIPS).
  • Sampling from Gaussian Process Posteriors using Stochastic Gradient Descent
    Javier Antorán, José Miguel Hernández-Lobato, David Janz, Jihao Andreas Lin, Shreyas Padhy, Alexander Terenin. Advances in Neural Information Processing Systems 36, pp. 36886–36912. Neural Information Processing Systems Foundation, Inc. (NeurIPS).