NGFT LLMs bridge state-of-the-art AI models with aviation knowledge.
State-of-the-art large language models (LLMs) are optimized for general tasks and lack specific knowledge of the aviation industry. NGFT-AI models understand aviation terminology and procedures and can be used for conversation or to power agentic workflows.
Our models¹ outperform comparable state-of-the-art models on aviation (ATPL), legal (Definition Extraction) and general multilingual (Multilingual MMLU) evaluations.
ngft-Llama-8B (all-data) and ngft-Llama-8B (llm-data) are two versions of the model trained on different data sets.
How do you guarantee the safety of my data?
We run NGFT-AI models on our own infrastructure, do not share your information with any third parties, and encrypt all interactions with our models to ensure confidentiality. We also implement robust access controls and security measures to prevent unauthorized access to your data.
How is your model different from ChatGPT?
While ChatGPT is a powerful tool, it is optimized for general tasks. Our custom model is tailored specifically to the needs of the aviation industry. This approach gives us a model that is not only powerful but also more proficient in the aviation context, while meeting heightened privacy and security requirements.
Can I use the custom AI model in my own applications?
Yes, our custom large language model is designed to integrate seamlessly with your existing systems and workflows. You can embed it directly into your applications or use our APIs to communicate with the model from your software (see the sketch below).
We will also soon integrate NGFT-AI models into our own software products.
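As a rough illustration, the sketch below shows how an application could query a hosted model over HTTP. It assumes a chat-completions-style JSON API; the endpoint URL, authentication scheme, and model identifier are placeholders rather than the published NGFT-AI interface.

```python
# Hypothetical sketch: the endpoint URL, auth scheme, and model name below are
# placeholders, not the actual published NGFT-AI API.
import requests

NGFT_API_URL = "https://api.example-ngft.ai/v1/chat/completions"  # placeholder URL
API_KEY = "YOUR_API_KEY"  # credential issued to your organization

def ask_ngft(question: str) -> str:
    """Send a single-turn question to a hosted NGFT-AI model and return the reply."""
    response = requests.post(
        NGFT_API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "ngft-Llama-8B",  # model identifier is an assumption
            "messages": [{"role": "user", "content": question}],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

print(ask_ngft("What are the rest requirements for flight crew under EASA FTL?"))
```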
Can the model hallucinate responses (producing factually wrong claims)?
No AI model is completely safe from hallucinations. However, because our models are trained on rich aviation and legal knowledge, they exhibit measurably fewer hallucinations in these domains.
Combined with direct access to your data through retrieval-augmented generation (RAG), the model is grounded even further, significantly reducing the chance of hallucinations.
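The sketch below illustrates the general RAG pattern under simplified assumptions: a tiny in-memory keyword retriever stands in for a real document index, and the `generate` stub stands in for the model call. It is not NGFT's actual pipeline.

```python
# Schematic illustration of retrieval-augmented generation (RAG), not NGFT's
# actual pipeline: the in-memory "index" and keyword scoring stand in for a real
# vector store, and `generate` stands in for the model call.
from typing import List

DOCUMENTS = [
    "Flight crew must not exceed the maximum daily flight duty period.",
    "Maintenance releases must be signed by certifying staff before flight.",
    "Passengers must stow carry-on baggage during taxi, take-off and landing.",
]

def search_documents(query: str, k: int = 2) -> List[str]:
    """Rank documents by naive keyword overlap with the query (placeholder retriever)."""
    terms = set(query.lower().split())
    scored = sorted(DOCUMENTS, key=lambda d: -len(terms & set(d.lower().split())))
    return scored[:k]

def generate(prompt: str) -> str:
    """Placeholder for the actual LLM call; here it just echoes the prompt."""
    return f"[model response grounded in]\n{prompt}"

def answer_with_rag(question: str) -> str:
    """Retrieve relevant passages and pass them to the model as explicit context."""
    context = "\n".join(search_documents(question))
    prompt = (
        "Answer using only the context below; say so if the context is insufficient.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return generate(prompt)

print(answer_with_rag("Who signs the maintenance release?"))
```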
© 2024 NGFT Solutions AG