Unfortunately the quality of text extraction in GPT-4 Vision (shortly, GPT-4V) is not as well as the one of today’s state-of-the-art (SOTA) OCR model. This post introduce how to improve text extraction quality in GPT-4V with the help of OCR model.
System Message (Prompt) Example for Safety – ChatGPT
In this post, I’ll briefly show you risk mitigation architecture for adversarial prompting and prompt’s example for safety.
Implement Model Parallelism in LLMs
In this post, I will break down the techniques for scaling of large model’s training in a step-by-step manner.
Implement Advanced Reasoning in Semantic Kernel
In this blog post, I’ll show you how to implement custom Planner for advanced reasoning in Semantic Kernel. Reasoning will be a key to create an intelligent autonomous agent.
Image Processing in LLMs – TaskMatrix (Visual ChatGPT)
Visual ChatGPT (TaskMatrix) is one of interesting examples, in which visual information can be generated or replaced by interacting with OpenAI ChatGPT (LLM). This post shows you how it’s built on ReAct chain and reasoning.
ReAct (Reason+Act) prompting in LLMs
ReAct (Reasoning + Acting) is a flexible LLM chain framework and essential for today’s advanced LLM reasoning. LangChain helps you compose ReAct framework.
This post will give you the answer for the questions: “What is ReAct?”, “How ReAct works?”, and “How to build ReAct?”.
Hugging Face Fine-tune for Multilingual Question Answering (Japanese Example)
In this post, I’ll show you multilingual (Japanese) example for question-answering in Hugging Face.
This also describes how to configure practical QA system with the fine-tuned extractive QA models.
Hugging Face Fine-tune for Multilingual Summarization (Japanese Example)
In this post, I’ll show you multilingual (Japanese) example for text summarization in Hugging Face.
You can learn how to fine-tune multilingual transformer models in sequence-to-sequence tasks.
Hugging Face Fine-tune for Multilingual NER (Japanese Example)
In this post, I’ll show you fine-tuned NER (named entity recognition) classification example for Japanese language in Hugging Face. You can learn how to fine-tune multilingual transformer models in token classification tasks.
In the last part of this post, I’ll also optimize training with DeepSpeed which is well integrated with HuggingFace transformers.
Get Started with Optimization in Azure Quantum (Simulated Annealing)
In this post, I show you programming tutorial in Azure Quantum with a simple optimization (simulated annealing) example.