Join our pioneering initiative at the convergence of AI, neuroscience, and the future of brain enhancement. We're tackling a few fundamental questions: what happens when we bridge the artificial-biological intelligence gap from both directions? Circuit motifs and features ubiquitous in the biological brain, such as distributed representations and skip connections, have already been adopted in AI systems; can other motifs from the brain further enhance AI systems and make them more like biological brains?
We're building a novel "backbone" brain model that incorporates repeating circuit motifs of the biological brain, especially attractor networks, which have been evolutionarily conserved across species yet have only recently become computationally feasible to scale in AI systems. This creates a unique middle ground: more interpretable and adaptable than current AI systems, yet more capable than traditional neuroscience models.
What further distinguishes this project is our dual-purpose approach: as we develop increasingly capable brain-like systems, we simultaneously use them as testbeds for virtual brain-computer interface (BCI) enhancement through representation-level knowledge distillation. Starting with fundamental tasks like navigation and foraging, we systematically progress to complex cognitive challenges, rigorously benchmarking both baseline performance and enhancement gains. This will test whether a 'brain' model can acquire knowledge directly from a pre-trained AI model via a closed-loop BCI, bypassing conventional learning. Our effort has the potential to transform our understanding of intelligence while establishing practical foundations for future cognitive enhancement technologies.
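The core idea behind representation-level distillation can be sketched in a few lines: nudge the brain model's internal representations toward those of a frozen pre-trained teacher. The linear maps, dimensions, and plain gradient descent below are purely illustrative stand-ins, not our actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "teacher" representation map (stand-in for a pre-trained AI model)
# and a trainable "student" map (stand-in for the brain model's encoder).
W_teacher = rng.normal(size=(16, 8))
W_student = rng.normal(size=(16, 8))

X = rng.normal(size=(256, 16))   # stand-in sensory observations
target = X @ W_teacher           # teacher's internal representations

def rep_loss(W):
    """Mean squared distance between student and teacher representations."""
    return float(np.mean((X @ W - target) ** 2))

init_loss = rep_loss(W_student)
lr = 0.01
for _ in range(500):
    # Gradient of the representation-matching loss w.r.t. the student weights.
    grad = 2 * X.T @ (X @ W_student - target) / len(X)
    W_student -= lr * grad
final_loss = rep_loss(W_student)
```

In the project itself the student would be a recurrent, attractor-based brain model coupled to the teacher through a closed-loop virtual BCI rather than a direct weight update, but the same representation-matching principle applies.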
Primary Responsibilities:
Simulation Environment and Task Development
- Design and implement environments, assets, and tasks, leveraging GPU-based parallelization for efficient, high-fidelity 3D simulations (e.g., using NVIDIA Isaac Sim/Lab or ManiSkill), starting with spatial navigation tasks and subsequently adding cognitive tasks.
- Develop visualization tools to analyze model behavior, capturing sensory-motor interactions and neural activity as episodic trajectories. Systematically collect, organize, and catalog these trajectories across a diverse range of simulation tasks.
- Train baseline deep learning models in a sensory-motor loop to complete basic navigation tasks on a local machine.
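For intuition, a toy version of such a sensory-motor navigation loop, with an episodic trajectory recorded along the way, might look like the sketch below. It is a deliberately minimal stand-in for a GPU-parallel Isaac Sim/ManiSkill environment; all names and dynamics are illustrative assumptions:

```python
import numpy as np

class ToyNav:
    """Minimal 2D point-to-goal navigation task (illustrative stand-in
    for a full GPU-accelerated 3D simulation environment)."""
    def __init__(self, seed=0):
        self.rng = np.random.default_rng(seed)

    def reset(self):
        self.pos = self.rng.uniform(-1, 1, size=2)
        self.goal = self.rng.uniform(-1, 1, size=2)
        return np.concatenate([self.pos, self.goal])  # observation

    def step(self, action):
        # Move the agent, reward it by negative distance to the goal.
        self.pos = np.clip(self.pos + 0.1 * np.asarray(action), -1, 1)
        dist = np.linalg.norm(self.goal - self.pos)
        done = dist < 0.1
        return np.concatenate([self.pos, self.goal]), -dist, done

# A trivial greedy "policy": head straight toward the goal, while the
# sensory-motor episode is captured as a trajectory for later cataloging.
env = ToyNav()
obs = env.reset()
trajectory = [obs]
for t in range(100):
    direction = obs[2:] - obs[:2]
    obs, reward, done = env.step(direction / (np.linalg.norm(direction) + 1e-8))
    trajectory.append(obs)
    if done:
        break
```

A learned policy would replace the greedy controller, and the recorded trajectories would feed the offline training datasets described below.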
Benchmarking & Evaluation Framework
- Develop comprehensive benchmarking protocols and evaluation metrics to assess the performance of brain-inspired models against standard AI baselines.
- Apply the developed metrics to quantify performance improvements derived from BCI enhancements integrated into brain-inspired models.
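As a sketch of what such metrics could look like, here are two hypothetical measures: a success rate against a task-specific threshold, and a relative enhancement gain of a BCI-enhanced model over its baseline. The formulas are illustrative assumptions, not an established protocol:

```python
import numpy as np

def success_rate(episode_returns, threshold):
    """Fraction of episodes whose return exceeds a task-specific threshold."""
    r = np.asarray(episode_returns, dtype=float)
    return float((r > threshold).mean())

def enhancement_gain(baseline_scores, enhanced_scores):
    """Relative improvement of the enhanced model over the baseline.
    Assumes higher scores are better; the metric choice is illustrative."""
    b, e = np.mean(baseline_scores), np.mean(enhanced_scores)
    return (e - b) / abs(b)

baseline = [0.55, 0.60, 0.58]   # hypothetical baseline task scores
enhanced = [0.70, 0.72, 0.68]   # hypothetical scores after BCI enhancement
gain = enhancement_gain(baseline, enhanced)  # ~0.214, i.e. ~21% improvement
```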
Cloud GPU for Model Enhancement
- Utilize synthetic datasets generated from virtual simulations to train models offline, leveraging advanced H100 GPU clusters for efficient scaling.
Additional Responsibilities
You will support the broader team and collaborate in the following areas.
Brain-Inspired Neural Architecture Development
- Contribute to the development of sensory and motor modules using standard deep learning frameworks.
- Support the design and implementation of spatial, episodic memory, working memory, and evaluation/planning modules using brain-like attractor network architectures.
Qualifications:
Required Experience:
- PhD in AI, Robotics, Computational Neuroscience, Computer Science, or equivalent practical experience
- Strong programming skills with Python, PyTorch or other deep learning frameworks
- Expertise with state-of-the-art simulation and RL training frameworks (such as NVIDIA Isaac Sim/Lab or ManiSkill)
- Experience with training models on distributed GPU systems (e.g., H100 cluster)
Preferred Experience:
- Knowledge of computational neuroscience and modelling neural circuits
- Experience in modeling attractor networks
Location & Start Date:
- In-person or hybrid in Emeryville, California.
- A start date between April 15 and May 31 is preferred
We welcome applications from candidates authorized to work in the US, and we provide visa sponsorship (including H-1B cap-exempt and F-1 STEM OPT) for qualified candidates.
Compensation & Benefits:
$180,000-$250,000 annually, commensurate with experience. We also offer benefits including:
- Health Insurance (inc. medical, dental, and vision) for employee and dependents
- Family leave (up to 12 weeks)
- A flexible time off policy
- 403(b) plan
Why Work Here?:
This role offers a unique opportunity to shape the exciting intersection of artificial intelligence and neuroscience, contributing directly to a high-risk, high-reward project with immense real-world impact. You’ll collaborate closely with Dr. Chongxi Lai—a pioneering neuroscientist and engineer who developed the world’s first hippocampal map-based brain-machine interface (BMI) to study thinking. He also extensively studies cognitive processes in both biological brains and AI systems. By advancing cutting-edge brain simulations and cognitive enhancement strategies, this project aims, at minimum, to bridge the gap between neuroscience and artificial intelligence, fundamentally enriching our understanding of intelligence itself. At its highest ambition, breakthroughs from this work may provide critical insights into animal and human cognitive enhancement.
Great Timing and Broad Influence
We are at a pivotal moment where building advanced brain-like models and capturing their neural activity alongside behavioral data in large-scale, GPU-accelerated 3D simulated environments has just become achievable. This work is important for embodied AI, robotics, computational neuroscience, cognitive science, and brain-computer interface (BCI) enhancement. There's a strong likelihood these fields will converge significantly over the next decade. Your contributions will profoundly influence these intersecting disciplines. Our project positions you at the forefront, empowering you to shape the future of this exciting convergence.
Supportive Environment
At Astera Institute, research scientists and engineers work in a highly supportive environment that encourages innovation, with competitive salaries and sufficient funding to quickly test and develop groundbreaking ideas.
Computational Resources
We have local RTX-4090/5090 and H200 GPUs for simulation. Additionally, we have access to a cluster of 24,000 H100 GPUs via Voltage Park, enabling large-scale model training.