4 reasons to use open-source LLMs (especially after the OpenAI drama)

Alongside the market for closed-source LLMs like ChatGPT, an impressive array of open-source models has emerged. For enterprises, these language models are becoming increasingly compelling.

Understanding the impact of open-source language models

Cerebras Systems CEO Andrew Feldman explains the impact of open-source large language models (LLMs) on the broader AI community.

What we learned from the deep learning revolution

Neuroscientist Terry Sejnowski discusses the early struggles of deep learning, its explosion into the mainstream, and the lessons learned from decades of research and development.

AI21 Labs’ mission to make large language models get their facts...

AI21 Labs chief scientist Yoav Levine explains how retrieval augmented language modeling can solve one of the biggest problems of LLMs.

Current applications and future opportunities of VR in healthcare

Needle procedures have evoked fear and anxiety since your first trip to the doctor’s office. However, VR is changing that and reshaping the very nature of medical treatment.

The science of (artificial) intelligence

A science of artificial intelligence based on discredited epistemological foundations is doomed to fail, just as attempts to build other sciences on those foundations have failed. Scientific progress depends on a better understanding of the phenomena, not just producing slight improvements on some well-defined task.

Self-assembling neural networks can open new directions for AI research

A new software architecture uses neural development programs (NDP) to self-assemble deep learning models from basic units, like their biological counterparts.

How OpenAI uses network effects to protect ChatGPT’s market

ChatGPT is facing stiff competition from other closed- and open-source LLMs. Here is how OpenAI is using network effects to solidify its market share.

How to make your LLMs lighter with GPTQ quantization

Quantization reduces the size of large language models considerably. GPTQ is a popular quantization method that is supported by Hugging Face and applies to many LLMs.
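
As a rough sketch of what this looks like in practice, the snippet below quantizes a model to 4-bit weights with the GPTQConfig API in the transformers library. It assumes the optimum and auto-gptq packages are installed; the model ID is only a placeholder.

from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig

model_id = "facebook/opt-125m"  # placeholder; swap in the LLM you want to compress
tokenizer = AutoTokenizer.from_pretrained(model_id)

# 4-bit GPTQ quantization, calibrated on samples from the C4 dataset
gptq_config = GPTQConfig(bits=4, dataset="c4", tokenizer=tokenizer)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    quantization_config=gptq_config,
)
model.save_pretrained("opt-125m-gptq")  # quantized weights are a fraction of the original size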

Emotional prompts enhance language models, study finds

A study by Microsoft and other institutions shows that adding emotional stimuli to prompts enhances the performance of LLMs like ChatGPT.
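
As a hypothetical illustration of the idea (not the study’s exact protocol), appending an emotional stimulus to an ordinary prompt might look like this with the OpenAI Python client:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

base_prompt = "Summarize the attached contract in three bullet points."
emotional_stimulus = "This is very important to my career."  # example stimulus in the spirit of the study

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": f"{base_prompt} {emotional_stimulus}"}],
)
print(response.choices[0].message.content)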

No-code retrieval augmented generation (RAG) with LlamaIndex and ChatGPT

Retrieval augmented generation (RAG) enables you to use custom documents with LLMs to improve their precision.
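
The article walks through a no-code setup, but under the hood the flow is the same as this minimal LlamaIndex sketch. It assumes an OPENAI_API_KEY environment variable and a local data/ folder of documents; import paths vary between LlamaIndex versions.

from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Load your own documents and build a vector index over them
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Query the index; retrieved passages are passed to ChatGPT as context
query_engine = index.as_query_engine()
response = query_engine.query("What do these documents say about pricing?")
print(response)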
