Wednesday, November 27, 2024

Foundations of Automation and Intelligence


Today I am going to talk a little about what is happening and what is about to happen. There is a lot of talk about AI and AGI, neither of which is actually fully formed yet. What we are seeing are the results of work on Generative Pre-trained Transformers (GPTs) and Large Language Models (LLMs). They closely mimic the way the human mind uses some of its memory, by doing language prediction and reflection. But there is more to the process humans use than just memory and conversational text recall.
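To make the "language prediction" idea concrete, here is a minimal sketch of next-token prediction in Python. It uses a toy bigram table instead of a neural network, and the corpus, the bigram_counts table, and the predict_next helper are all illustrative assumptions, not how GPT models are actually built; but the core loop is the same: given the text so far, pick a likely next token.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the web-scale text a real LLM trains on.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count bigrams: how often each word follows each other word.
bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the corpus."""
    followers = bigram_counts.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Generate a short continuation, one predicted token at a time.
text = ["the"]
for _ in range(4):
    nxt = predict_next(text[-1])
    if nxt is None:
        break
    text.append(nxt)

print(" ".join(text))  # e.g. "the cat sat on the"
```

A real LLM replaces the bigram table with a Transformer that conditions on the whole context window, but generation still proceeds one predicted token at a time.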

ChatGPT and Bard are not models themselves but human-machine interfaces.

The models behind them are GPT-3.5 or GPT-4 for ChatGPT, and PaLM/PaLM 2 for Bard.

Google is the world leader in machine and deep learning. They have led the way with TPU hardware, the TensorFlow software framework, the PaLM 2 large language model, the PaLM API, MakerSuite, Bard, Generative AI Studio and Vertex AI, Gen App Builder, Phenaki, SEANet, Google DeepMind's Gemini, and much more. Google is also behind the original research on the Transformer architecture.
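As a taste of how a developer touches this stack, here is a minimal sketch of calling the PaLM API through the Vertex AI Python SDK. The project id and prompt are placeholders, and the text-bison model name and SDK calls are assumed from Google's PaLM-era documentation, not something this post specifies.

```python
# Requires: pip install google-cloud-aiplatform
# Assumes a GCP project with Vertex AI enabled; "my-project" is a placeholder.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-project", location="us-central1")

# "text-bison" was the PaLM 2 text model exposed through Vertex AI.
model = TextGenerationModel.from_pretrained("text-bison")

response = model.predict(
    "Summarize the Transformer architecture in two sentences.",
    temperature=0.2,        # low temperature for a focused answer
    max_output_tokens=128,  # cap on the generated length
)
print(response.text)
```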

Transformer: A Novel Neural Network Architecture for Language Understanding

In “Attention Is All You Need” (2017), we introduce the Transformer, a novel neural network architecture based on a self-attention mechanism that we believe to be particularly well suited for language understanding.
https://ai.googleblog.com/2017/08/transformer-novel-neural-network.html

Ref: “Attention Is All You Need”, https://arxiv.org/abs/1706.03762
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
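
To show what "self-attention" means in practice, here is a minimal NumPy sketch of the scaled dot-product attention at the heart of the Transformer, following the paper's formula Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The token count, embedding size, and random projection matrices are illustrative; in a real model the Q/K/V projections are learned.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    weights = softmax(scores)        # each row sums to 1
    return weights @ V               # weighted mix of the value vectors

# Illustrative sizes: 4 tokens, 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))

# In a real Transformer, W_q, W_k, W_v are learned; here they are random.
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_attention(x @ W_q, x @ W_k, x @ W_v)
print(out.shape)  # (4, 8): one context-aware vector per token
```

Each output row is a mixture of all the value vectors, weighted by how relevant each other token is to the current one; that is the sense in which every token can "attend" to every other token in a single step.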

PaLM 2 is more powerful than GPT-4, but it has been deliberately restrained to avoid legal and fairness problems.
https://github.com/Mooler0410/LLMsPracticalGuide
