Posts by Collection

portfolio

publications

Using Artificial Populations to Study Psychological Phenomena in Neural Models

Published in AAAI '24, 2024

We leverage work in uncertainty estimation in a novel approach to efficiently construct experimental populations of neural models. The resulting tool, PopulationLM, has been made open source. We provide theoretical grounding in the uncertainty estimation literature and motivation from current cognitive science work on language models.
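The paper's actual method is not reproduced here, but the core idea of turning one model into an experimental population via uncertainty estimation can be sketched with Monte Carlo dropout. The toy linear "model" below is hypothetical (numpy only, not PopulationLM's API): each population member is the same model with a different random dropout mask applied at inference time, so the spread across members reflects model uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, w, drop_mask):
    # A toy one-layer "model": dropout applied to the input units,
    # then a linear projection. Stands in for a full network.
    return (x * drop_mask) @ w

def build_population(x, w, n_members=10, p_drop=0.1, rng=rng):
    """Construct a population of model variants via MC dropout:
    each member keeps a different random subset of units active."""
    members = []
    for _ in range(n_members):
        mask = (rng.random(x.shape) >= p_drop).astype(float)
        members.append(forward(x, w, mask))
    return np.stack(members)

# One input, one weight matrix, twenty population members.
x = rng.standard_normal(8)
w = rng.standard_normal((8, 3))
pop = build_population(x, w, n_members=20)

# Population statistics: mean response and between-member spread.
mean, spread = pop.mean(axis=0), pop.std(axis=0)
```

Psychological-style experiments can then be run over the population's output distribution rather than a single deterministic forward pass.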

Recommended citation: Roberts, Jesse, et al. "Using Artificial Populations to Study Psychological Phenomena in Neural Models." Proceedings of the AAAI Conference on Artificial Intelligence. Vol. 38. No. 17. 2024. https://arxiv.org/abs/2308.08032

How Powerful are Decoder-Only Transformer Neural Models?

Published in IJCNN '24, 2024

This is the first work to directly address the Turing completeness of the underlying technology employed in GPT-x; past work has focused on the more expressive, full encoder-decoder transformer architecture. From this theoretical analysis, we show that the sparsity/compressibility of the word embedding is an important consideration for Turing completeness to hold.

Recommended citation: Roberts, Jesse. "How Powerful are Decoder-Only Transformer Neural Models?" arXiv preprint arXiv:2305.17026 (2023). https://arxiv.org/abs/2305.17026

talks

teaching
