Smith, J., Johnson, A., Williams, B.
Advances in Neural Information Processing Systems (NeurIPS)
This paper introduces a novel approach to optimizing transformer models for deployment on devices with limited computational resources. We demonstrate a 40% reduction in model size while maintaining 95% of the original performance.
Smith, J., Chen, C., Garcia, M.
ACM Computing Surveys
This comprehensive survey examines the current state of vision-language models, analyzing their architectures, training methodologies, and applications. We identify key challenges and promising research directions.
Smith, J., Park, L., Thompson, R.
IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Best Paper Award
This paper presents a new framework for few-shot learning that leverages meta-learning and self-supervised pretraining to achieve state-of-the-art results on benchmark datasets.
Smith, J., Lee, K., Patel, R.
IEEE Transactions on Medical Imaging
We propose a self-supervised learning approach for medical image analysis that reduces the need for large annotated datasets. Our method shows promising results on X-ray, MRI, and CT scan datasets.
Smith, J., Zhang, H., Brown, T.
International Conference on Machine Learning (ICML)
This paper introduces a novel attention mechanism for graph neural networks that improves performance on node classification and link prediction tasks across various domains.
Smith, J., Miller, S.
Advances in Neural Information Processing Systems (NeurIPS)
We present a novel approach to adversarial defense that learns robust features invariant to adversarial perturbations, achieving state-of-the-art robustness on benchmark datasets.
Smith, J., Wang, L., Davis, J.
International Conference on Learning Representations (ICLR)
This paper proposes a parameter sharing strategy for neural architecture search that significantly reduces computational requirements while maintaining competitive performance.