---
license: mit
language:
- en
base_model:
- microsoft/phi-4
pipeline_tag: text-generation
tags:
- unsloth
datasets:
- mlabonne/FineTome-100k
metrics:
- accuracy
new_version: microsoft/phi-4-gguf
library_name: transformers
---

My first Hugging Face model: the default Unsloth phi-4 template with a LoRA fine-tune.
Trained locally for around 2 hours, using roughly 16 GB of system RAM to hold the training data
and about 8 GB of GPU memory for the training itself.
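For readers who want to reproduce a similar setup, below is a minimal sketch of how an Unsloth LoRA fine-tune of phi-4 is typically configured. The base model and dataset names come from the card metadata above; every hyperparameter (sequence length, LoRA rank, alpha, target modules) is an assumption for illustration, not the actual configuration used to train this model.

```python
# Hypothetical sketch of an Unsloth + LoRA setup for phi-4.
# All hyperparameters below are assumptions, not the card author's values.
MODEL_NAME = "microsoft/phi-4"            # base_model from the card metadata
DATASET_NAME = "mlabonne/FineTome-100k"   # dataset from the card metadata
MAX_SEQ_LENGTH = 2048                     # assumed context length
LORA_RANK = 16                            # assumed LoRA rank
LORA_ALPHA = 16                           # assumed LoRA scaling factor
TARGET_MODULES = [                        # typical attention/MLP projections
    "q_proj", "k_proj", "v_proj", "o_proj",
    "gate_proj", "up_proj", "down_proj",
]

def build_peft_model():
    """Load the base model in 4-bit and attach LoRA adapters (needs a GPU)."""
    # Import deferred so the sketch can be read without unsloth installed.
    from unsloth import FastLanguageModel

    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name=MODEL_NAME,
        max_seq_length=MAX_SEQ_LENGTH,
        load_in_4bit=True,  # 4-bit quantization keeps GPU memory use low
    )
    model = FastLanguageModel.get_peft_model(
        model,
        r=LORA_RANK,
        lora_alpha=LORA_ALPHA,
        lora_dropout=0.0,
        bias="none",
        target_modules=TARGET_MODULES,
    )
    return model, tokenizer
```

Loading in 4-bit and training only the low-rank adapter matrices is what makes a model of this size fit into roughly 8 GB of GPU memory.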