NCU SmartLLM (FP32) — Fine-Tuned Mistral-7B

License: Apache-2.0

A full-precision, fully fine-tuned version of Mistral-7B trained on domain-specific data from The NorthCap University (NCU). It answers academic, administrative, and general university queries in an instruction-following format.


Model Summary

  • Base Model: Mistral-7B
  • Fine-tuning: Full fine-tuning (FP32)
  • Architecture: Transformer Decoder (Causal LM)
  • Dataset: NCU-specific instructions (approx. 1,100 records)
  • Precision: Full precision (FP32)
  • Trained on: Google Colab (T4 GPU, 4 Epochs)

Capabilities

The model has been trained to:

  • Answer FAQs related to NCU (hostel, fees, scholarships, re-evaluation, etc.)
  • Act as an academic assistant for students
  • Handle general admin-related queries
  • Demonstrate Mistral's capabilities in an Indian academic context

The model is fully plug-and-play with Hugging Face Transformers and standard inference APIs.


Usage

Inference (Transformers)

pip install transformers accelerate
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("pranav2711/ncu-smartllm-fp32", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("pranav2711/ncu-smartllm-fp32")

prompt = "### Question:\nHow do I apply for hostel at NCU?\n\n### Answer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
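Because the decoded output echoes the prompt, the answer text can be pulled out by splitting on the `### Answer:` marker. A small post-processing sketch (the helper name is illustrative, not part of the model card):

```python
def extract_answer(generated_text: str) -> str:
    """Return only the text after the last '### Answer:' marker.

    If the marker is absent, the full text is returned unchanged.
    """
    marker = "### Answer:"
    # rpartition keeps everything after the final occurrence of the marker;
    # when the marker is not found, the tail is the whole string.
    _, _, tail = generated_text.rpartition(marker)
    return tail.strip()

decoded = "### Question:\nHow do I apply for hostel at NCU?\n\n### Answer:\nSubmit the hostel form on the NCU portal."
print(extract_answer(decoded))  # → Submit the hostel form on the NCU portal.
```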

Model Training Details

  Detail                  Value
  Base model              mistralai/Mistral-7B-v0.1
  Fine-tuned epochs       10
  Batch size              2
  Tokenizer max length    512
  Output format           Instruction → Response
  Loss (final epoch)      ~1.4053
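The hyperparameters above can be expressed as a Transformers `TrainingArguments` configuration. This is a sketch of plausible settings, not the exact training script; the learning rate and output directory are assumptions not stated in the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ncu-smartllm-fp32",   # assumed output path
    num_train_epochs=10,              # "Fine-tuned epochs" from the table
    per_device_train_batch_size=2,    # "Batch size" from the table
    learning_rate=2e-5,               # assumed; not stated in the card
    fp16=False,                       # full precision (FP32), per the card
    logging_steps=10,
    save_strategy="epoch",
)
```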

Dataset Format (Instruction-Tuned)

Each entry follows:

{
  "instruction": "How do I apply for a degree certificate?",
  "input": "I graduated in 2023.",
  "output": "You can apply for the degree certificate through the Registrar's Office. Submit your documents along with..."
}

Formatted as:

### Question:
[instruction + optional input]

### Answer:
[output]
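The mapping from a dataset record to this prompt string can be sketched as follows (field names follow the JSON entry above):

```python
def format_example(record: dict) -> str:
    """Render an instruction-tuning record into the '### Question / ### Answer' format."""
    question = record["instruction"]
    # The optional "input" field is appended to the instruction when present.
    if record.get("input"):
        question += "\n" + record["input"]
    return f"### Question:\n{question}\n\n### Answer:\n{record['output']}"

record = {
    "instruction": "How do I apply for a degree certificate?",
    "input": "I graduated in 2023.",
    "output": "You can apply through the Registrar's Office.",
}
print(format_example(record))
```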

Model Applications

  • College ERP chatbots
  • Student helpdesks
  • Admission or exam queries
  • EdTech integrations for personalized response systems

License

This model is released under the Apache 2.0 license. You are free to use, modify, and distribute it with attribution.


Author & Maintainer

  • Pranav Singh
  • The NorthCap University (NCU), Gurugram
  • For collaboration: ping on Hugging Face or GitHub

How to Support

  • Star this model on Hugging Face
  • Try it on Spaces and share your feedback
  • Contribute improvements or suggest datasets
