ChemicalQDevice
How to Code LLMs Layer by Layer with Tensor Network Substitutions
April 4, 2024

Open-source Large Language Models (LLMs) can be fine-tuned using platforms such as Hugging Face and PyTorch. To address Generative AI issues in medicine, substituting transformer components with tensor networks layer by layer is likely required to create more practical models. Tensor networks decompose higher-dimensional data into contractions of smaller tensors, enabling efficient processing on classical computers.
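As a minimal sketch of the layer-by-layer substitution idea, the PyTorch code below replaces a dense transformer feed-forward projection with a two-core tensor-train (matrix product state style) layer. The class name TTLinear, the factor splits, and the rank of 16 are illustrative assumptions for this sketch, not a prescription from any specific paper.

# A minimal sketch, assuming PyTorch. TTLinear, the factor splits, and
# the rank below are illustrative assumptions, not an API from Hugging
# Face, PyTorch, or the papers surveyed here.
import torch
import torch.nn as nn

class TTLinear(nn.Module):
    """Approximates nn.Linear(i1*i2, o1*o2) with two small tensor-train cores."""
    def __init__(self, in_factors, out_factors, rank):
        super().__init__()
        i1, i2 = in_factors
        o1, o2 = out_factors
        # Two tensor-train cores replace the dense (out x in) weight matrix.
        self.core1 = nn.Parameter(torch.randn(i1, o1, rank) * 0.02)
        self.core2 = nn.Parameter(torch.randn(rank, i2, o2) * 0.02)
        self.bias = nn.Parameter(torch.zeros(o1 * o2))
        self.in_factors, self.out_factors = in_factors, out_factors

    def forward(self, x):
        b = x.shape[0]
        i1, i2 = self.in_factors
        o1, o2 = self.out_factors
        # Reshape the input to match the factored index structure,
        # then contract it with both cores; r is the TT bond dimension.
        x = x.reshape(b, i1, i2)
        y = torch.einsum("bij,iar,rjc->bac", x, self.core1, self.core2)
        return y.reshape(b, o1 * o2) + self.bias

# Dense GPT-2-sized MLP up-projection (768 -> 3072) has ~2.36M weights.
# This TT version with rank 16 has 24*48*16 + 16*32*64 = 51,200 parameters.
layer = TTLinear(in_factors=(24, 32), out_factors=(48, 64), rank=16)
out = layer(torch.randn(4, 768))
print(out.shape)  # torch.Size([4, 3072])

Factoring 768 as 24 x 32 and 3072 as 48 x 64 is one of many valid splits; the bond dimension (rank) is the knob that trades approximation accuracy against parameter count.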

Here, 10 tensor network papers from the last three years relevant to re-coding transformers and large language models for higher utility are detailed. Common techniques such as matrix product states, matrix product operators, and Kronecker decompositions have raised awareness of tensor networks' suitability for data science. Broader near-term adoption of tensor networks is plausible because researchers have found that models built with tensor networks typically offer greater explainability, a key area of improvement needed for Generative AI, as identified by Stanford HAI.
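For comparison with the tensor-train sketch above, the following is a minimal sketch of the Kronecker decomposition technique named in this section: a dense weight matrix W is replaced by the Kronecker product of two small factors. The class name KroneckerLinear and the factor shapes are illustrative assumptions.

# A minimal sketch, assuming PyTorch. KroneckerLinear and the shapes
# below are illustrative assumptions, not from the cited papers.
import torch
import torch.nn as nn

class KroneckerLinear(nn.Module):
    """Replaces a dense weight W (out x in) with A kron B."""
    def __init__(self, a_shape, b_shape):
        super().__init__()
        self.A = nn.Parameter(torch.randn(*a_shape) * 0.02)  # (o1, i1)
        self.B = nn.Parameter(torch.randn(*b_shape) * 0.02)  # (o2, i2)

    def forward(self, x):
        # torch.kron materializes the full (o1*o2 x i1*i2) weight for
        # clarity; an efficient kernel would contract the factors
        # against reshaped activations instead.
        W = torch.kron(self.A, self.B)
        return x @ W.T

# A 768 -> 768 projection: dense = 589,824 weights; Kronecker factors
# = 24*24 + 32*32 = 1,600 parameters.
layer = KroneckerLinear(a_shape=(24, 24), b_shape=(32, 32))
print(layer(torch.randn(4, 768)).shape)  # torch.Size([4, 768])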

“Many commentators have declared the end of health care as we know it, given the apparent ability of LLMs to pass U.S. Medical Licensing Exams. But health care practice involves more than being able to answer a multiple choice test. It involves substantiating, explaining, and assessing claims with reliable, scientific sources. And on that score, GenAI still has a long way to go.” - Wu, K., et al., Stanford University HAI, February 12, 2024. Source: https://hai.stanford.edu/news/generating-medical-errors-genai-and-erroneous-medical-references

Created by Kevin Kawchak, Founder and CEO, ChemicalQDevice, San Diego, California, 2024. Healthcare Innovation.