FAQs


Q1: Are TAIHU and TAIDE the same?
No, they are not the same. TAIHU (TAIwan HUmanities Conversational AI Knowledge Discovery System) is a conversational AI system focused on knowledge discovery in the humanities, integrating large language models with Taiwan-specific humanities databases to support scholars in knowledge exploration and data retrieval. TAIDE, on the other hand, is an open language model development project led by Taiwan’s National Science and Technology Council (NSTC), focusing on foundational infrastructure and localization of language models. The two have different goals and application domains.
Q2: Is TAIHU a non-profit organization?
Yes, TAIHU itself operates as a non-profit organization.
Q3: Who is the executing institution behind TAIHU?
TAIHU is led by Professor Su-Ling Yeh from the Department of Psychology at National Taiwan University. It is executed in collaboration with various humanities research units and technical teams both within and outside the university. The project is currently funded by the NSTC, with Professor Yeh’s research team responsible for system development and research advancement.
Q4: What is the goal of TAIHU?
TAIHU aims to create a dialogue-based platform for humanities knowledge discovery, helping scholars overcome challenges such as data fragmentation and inefficient search processes. The ultimate goal is to enhance the accessibility, searchability, and applicability of Taiwan’s local humanities data, supporting digital transformation and research innovation in the humanities.
Q5: What is TAIHU’s mechanism?
TAIHU employs Retrieval-Augmented Generation (RAG) technology, integrating Taiwan-specific humanities databases and enabling multi-turn interactive knowledge exploration. The system features a fact-verification mechanism that cites the evidence sources behind generated content, improving accuracy and reliability. It also includes localized evaluation protocols and user experience research processes to continually optimize system performance.
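The RAG flow described above can be sketched as a small, self-contained example: retrieve the passages most relevant to a query, then assemble a prompt in which each passage carries its source id, so the model's answer can cite its evidence. The toy corpus, keyword-overlap retriever, and prompt wording below are illustrative assumptions, not TAIHU's actual databases or implementation.

```python
# Toy in-memory corpus standing in for Taiwan-specific humanities databases.
CORPUS = [
    {"id": "doc1", "text": "Dadaocheng was a major trading port in 19th-century Taipei."},
    {"id": "doc2", "text": "The Taiwan Railway opened its first line in 1891."},
    {"id": "doc3", "text": "Hakka communities settled widely in Taoyuan and Hsinchu."},
]

def retrieve(query: str, corpus, k: int = 2):
    """Rank documents by word overlap with the query (toy retriever;
    a real system would use dense or hybrid retrieval)."""
    q_words = set(query.lower().split())
    scored = [
        (len(q_words & set(doc["text"].lower().split())), doc)
        for doc in corpus
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:k] if score > 0]

def build_prompt(query: str, docs):
    """Assemble the augmented prompt; each passage keeps its source id
    so the generated answer can cite evidence."""
    context = "\n".join(f"[{d['id']}] {d['text']}" for d in docs)
    return (
        "Answer using only the sources below and cite their ids.\n"
        f"Sources:\n{context}\n\n"
        f"Question: {query}"
    )

docs = retrieve("When did the Taiwan Railway open?", CORPUS)
prompt = build_prompt("When did the Taiwan Railway open?", docs)
```

The augmented prompt would then be passed to a large language model; because every retrieved passage is labeled with its id, generated statements can be traced back to their sources, which is the basis of the fact-verification step.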
Q6: Does TAIHU train its own models?
TAIHU does not currently train its own foundational large language models (such as GPT). Instead, it enhances the performance of existing models in specific domains through RAG, prompt engineering, retrieval techniques, and system design. In the future, localized fine-tuning or dedicated model training may be considered depending on research development and resource availability.
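Prompt engineering, one of the techniques named above, can be illustrated with a minimal template that steers a general-purpose model toward the humanities domain without any retraining. The template wording and function name here are hypothetical examples, not TAIHU's actual prompts.

```python
# Hypothetical domain-steering template: constrains a general-purpose model
# to answer in a scholarly register and to admit when evidence is missing.
TEMPLATE = (
    "You are an assistant specializing in Taiwanese humanities research.\n"
    "Answer in a scholarly register, and reply 'unknown' if the answer "
    "is not supported by the provided materials.\n\n"
    "Question: {question}"
)

def make_prompt(question: str) -> str:
    """Wrap a user question in the domain-steering template."""
    return TEMPLATE.format(question=question)

prompt = make_prompt("What role did Dadaocheng play in Taipei's tea trade?")
```

In practice such a template would be combined with the retrieved passages from the RAG step, so domain steering and evidence citation work together.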