Publications

2023

Efficient Transformer Models for Resource-Constrained Devices

Smith, J., Johnson, A., Williams, B.

Proceedings of the Conference on Neural Information Processing Systems (NeurIPS)

This paper introduces a novel approach to optimizing transformer models for deployment on devices with limited computational resources. We demonstrate a 40% reduction in model size while maintaining 95% of the original performance.

2022

A Survey of Vision-Language Models: Capabilities, Limitations, and Future Directions

Smith, J., Chen, C., Garcia, M.

ACM Computing Surveys

This comprehensive survey examines the current state of vision-language models, analyzing their architectures, training methodologies, and applications. We identify key challenges and promising research directions.

Novel Approaches to Few-Shot Learning in Computer Vision

Smith, J., Park, L., Thompson, R.

IEEE Conference on Computer Vision and Pattern Recognition (CVPR)

Best Paper Award

This paper presents a new framework for few-shot learning that leverages meta-learning and self-supervised pretraining to achieve state-of-the-art results on benchmark datasets.

2021

Self-Supervised Learning for Medical Image Analysis

Smith, J., Lee, K., Patel, R.

IEEE Transactions on Medical Imaging

We propose a self-supervised learning approach for medical image analysis that reduces the need for large annotated datasets. Our method shows promising results on X-ray, MRI, and CT scan datasets.

Attention Mechanisms in Graph Neural Networks

Smith, J., Zhang, H., Brown, T.

International Conference on Machine Learning (ICML)

This paper introduces a novel attention mechanism for graph neural networks that improves performance on node classification and link prediction tasks across various domains.

2020

Robust Feature Learning for Adversarial Defense

Smith, J., Miller, S.

Advances in Neural Information Processing Systems (NeurIPS)

We present a novel approach to adversarial defense that focuses on learning robust features that are invariant to adversarial perturbations, achieving state-of-the-art robustness on benchmark datasets.

2019

Efficient Neural Architecture Search via Parameter Sharing

Smith, J., Wang, L., Davis, J.

International Conference on Learning Representations (ICLR)

This paper proposes a parameter sharing strategy for neural architecture search that significantly reduces computational requirements while maintaining competitive performance.