Tag: Architecture

EvoLMM

EvoLMM is a self-evolving framework for training large multimodal models using continuous self-rewarding processes.
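The entry above describes training driven by a continuous self-rewarding process rather than external labels. As a purely hypothetical sketch (the reward function, update rule, and all names here are illustrative assumptions, not EvoLMM's actual method), a self-rewarding loop can be caricatured as: sample candidates, score them with the model's own reward, and move toward the best-scoring one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for model parameters; everything below is an illustrative
# assumption, not the EvoLMM algorithm itself.
policy = np.zeros(4)

def self_reward(sample):
    # The model's own judgment of sample quality (here: closeness to a
    # fixed target, chosen only to make the toy loop converge).
    return -np.sum((sample - 1.0) ** 2)

for step in range(200):
    candidates = policy + 0.1 * rng.standard_normal((8, 4))  # sampled variations
    best = max(candidates, key=self_reward)                  # self-evaluate
    policy += 0.5 * (best - policy)                          # move toward best

print(np.round(policy, 1))
```

The key property the sketch shares with the entry's description is that no external supervision appears anywhere: the same model both generates and scores candidates.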

Mamba-Transformer

A hybrid model architecture that combines the efficiency of state-space models with the expressivity of attention mechanisms.
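The trade-off named above (linear-time state-space mixing vs. quadratic but content-based attention) can be sketched in a toy hybrid block. This is a minimal illustration of the general SSM-plus-attention pattern, not the paper's architecture; the layer shapes, the scalar decay `A`, and the residual layout are all assumptions.

```python
import numpy as np

def ssm_layer(x, A=0.9):
    # Linear state-space recurrence h[t] = A*h[t-1] + x[t]: O(T) in sequence
    # length, which is the efficiency side of the hybrid.
    h = np.zeros_like(x)
    state = np.zeros(x.shape[1])
    for t in range(x.shape[0]):
        state = A * state + x[t]
        h[t] = state
    return h

def attention_layer(x):
    # Single-head self-attention: O(T^2), but mixes tokens by content,
    # which is the expressivity side of the hybrid.
    scores = x @ x.T / np.sqrt(x.shape[1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

def hybrid_block(x):
    # One hybrid block: cheap recurrent mixing, then attention, each residual.
    x = x + ssm_layer(x)
    x = x + attention_layer(x)
    return x

x = np.random.randn(16, 8)   # (sequence length, model dim)
y = hybrid_block(x)
print(y.shape)               # (16, 8)
```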

Codec2Vec

Codec2Vec is a speech representation learning framework that uses discrete audio codec units for feature extraction.
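Since the entry hinges on using discrete codec units as the input representation, a toy version of that step may help: map each continuous audio frame to its nearest codebook entry, then look up a dense embedding for the resulting unit ID. The codebook size, dimensions, and random initialization here are hypothetical placeholders, not Codec2Vec's actual codec.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: a codec codebook and a unit-embedding table.
codebook = rng.standard_normal((256, 8))     # 256 discrete units, 8-dim frames
embeddings = rng.standard_normal((256, 32))  # dense embedding per unit

def encode_to_units(frames):
    # Quantize: assign each frame to its nearest codebook vector (the unit ID).
    d = ((frames[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

frames = rng.standard_normal((100, 8))       # 100 frames of toy "audio" features
units = encode_to_units(frames)              # discrete unit IDs, shape (100,)
features = embeddings[units]                 # dense features for the downstream model
print(units.shape, features.shape)
```

The point of the discretization is that downstream representation learning operates on a finite unit vocabulary rather than raw waveforms.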

Mamba-Attention

A hybrid mechanism combining Mamba-style state-space layers with attention, aimed at improving the efficiency of large language models without sacrificing modeling quality.

Nemotron Elastic

A framework for building reasoning-oriented large language models that incorporates multiple nested submodels optimized for different deployment configurations.
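One way to picture "multiple nested submodels" is weight slicing: smaller deployments reuse the leading slice of the full model's weights, so every submodel is contained inside the larger ones. The slicing scheme below is an illustrative assumption for a single toy layer, not Nemotron Elastic's training or extraction procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Full-width weights of one toy layer; narrower configurations reuse the
# leading rows/columns, making each submodel nested in the full model.
W_full = rng.standard_normal((64, 64))

def submodel_forward(x, width):
    # Extract the nested submodel by slicing the first `width` dimensions.
    W = W_full[:width, :width]
    return np.tanh(x[:, :width] @ W)

x = rng.standard_normal((4, 64))
outputs = {}
for width in (16, 32, 64):   # three deployment configurations, one checkpoint
    outputs[width] = submodel_forward(x, width)
    print(width, outputs[width].shape)
```

The practical appeal is that one trained checkpoint serves several compute budgets, since extracting a smaller submodel is just slicing rather than retraining.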

AINA

AINA is a framework designed to learn robot manipulation policies from human demonstrations in natural environments.