Update README.md
---
base_model:
- LeroyDyer/SpydazWeb_AI_CyberTron_Ultra_7b
- LeroyDyer/LCARS_AI_StarTrek_Computer
- LeroyDyer/_Spydaz_Web_AI_ActionQA_Project
- LeroyDyer/_Spydaz_Web_AI_ChatML_512K_Project
- LeroyDyer/SpyazWeb_AI_DeepMind_Project
- LeroyDyer/SpydazWeb_AI_Swahili_Project
- LeroyDyer/_Spydaz_Web_AI_08
- LeroyDyer/_Spydaz_Web_AI_ChatQA_001
- LeroyDyer/_Spydaz_Web_AI_ChatQA_001_SFT
- LeroyDyer/_Spydaz_Web_AI_ChatQA_001_UFT
- LeroyDyer/_Spydaz_Web_AI_010
library_name: transformers
language:
- en
- sw
- ig
- so
- es
- ca
- xh
- zu
- ha
- tw
- af
- hi
- bm
- su
datasets:
- gretelai/synthetic_text_to_sql
- HuggingFaceTB/cosmopedia
- teknium/OpenHermes-2.5
- Open-Orca/SlimOrca
- Open-Orca/OpenOrca
- cognitivecomputations/dolphin-coder
- databricks/databricks-dolly-15k
- yahma/alpaca-cleaned
- uonlp/CulturaX
- mwitiderrick/SwahiliPlatypus
- swahili
- Rogendo/English-Swahili-Sentence-Pairs
- ise-uiuc/Magicoder-Evol-Instruct-110K
- meta-math/MetaMathQA
- abacusai/ARC_DPO_FewShot
- abacusai/MetaMath_DPO_FewShot
- abacusai/HellaSwag_DPO_FewShot
- HaltiaAI/Her-The-Movie-Samantha-and-Theodore-Dataset
- HuggingFaceFW/fineweb
- occiglot/occiglot-fineweb-v0.5
- omi-health/medical-dialogue-to-soap-summary
- keivalya/MedQuad-MedicalQnADataset
- ruslanmv/ai-medical-dataset
- Shekswess/medical_llama3_instruct_dataset_short
- ShenRuililin/MedicalQnA
- virattt/financial-qa-10K
- PatronusAI/financebench
- takala/financial_phrasebank
- Replete-AI/code_bagel
- athirdpath/DPO_Pairs-Roleplay-Alpaca-NSFW
- IlyaGusev/gpt_roleplay_realm
- rickRossie/bluemoon_roleplay_chat_data_300k_messages
- jtatman/hypnosis_dataset
- Hypersniper/philosophy_dialogue
- Locutusque/function-calling-chatml
- bible-nlp/biblenlp-corpus
- DatadudeDev/Bible
- Helsinki-NLP/bible_para
- HausaNLP/AfriSenti-Twitter
- aixsatoshi/Chat-with-cosmopedia
- xz56/react-llama
- BeIR/hotpotqa
- YBXL/medical_book_train_filtered
tags:
- mergekit
- merge
- Mistral_Star
- Mistral_Quiet
- Mistral
- Mixtral
- Question-Answer
- Token-Classification
- Sequence-Classification
- SpydazWeb-AI
- chemistry
- biology
- legal
- code
- climate
- medical
- LCARS_AI_StarTrek_Computer
- text-generation-inference
- chain-of-thought
- tree-of-knowledge
- forest-of-thoughts
- visual-spacial-sketchpad
- alpha-mind
- knowledge-graph
- entity-detection
- encyclopedia
- wikipedia
- stack-exchange
- Reddit
- Cyber-series
- MegaMind
- Cybertron
- SpydazWeb
- Spydaz
- LCARS
- star-trek
- mega-transformers
- Mulit-Mega-Merge
- Multi-Lingual
- Afro-Centric
- African-Model
- Ancient-One
---
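The `base_model` list above was combined with mergekit. The actual merge configuration is not included in this card, so the sketch below only shows the general shape of a mergekit linear merge over two of the listed models; the merge method and weights are placeholders, and only `dtype: float16` is carried over from the earlier revision of this card.

```yaml
# Illustrative mergekit config: merge_method and weights are placeholders,
# not the settings actually used to produce this model.
models:
  - model: LeroyDyer/SpydazWeb_AI_CyberTron_Ultra_7b
    parameters:
      weight: 0.5
  - model: LeroyDyer/LCARS_AI_StarTrek_Computer
    parameters:
      weight: 0.5
merge_method: linear
dtype: float16
```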

Quote for Motivation:

> "Success comes from defining each task in achievable steps. Every completed step is a success that brings you closer to your goal. If your steps are unreachable, failure is inevitable. Winners create more winners, while losers do the opposite. Success is a game of winners!"
>
> — Leroy Dyer (1972-Present)

<img src="https://cdn-avatars.huggingface.co/v1/production/uploads/65d883893a52cd9bcd8ab7cf/tRsCJlHNZo1D02kBTmfy9.jpeg" width="300"/>

> "To grow as a professional, set goals just beyond your current abilities. Achieving these milestones will not only overcome obstacles but also strengthen your skillset. If your tasks are too easy, you'll never challenge yourself or improve, and life will pass you by!"

This model is based on the world's archive of knowledge: maintaining historical documents and providing services for the survivors of mankind, who may need to construct shelters, develop technologies, or find medical resources, as well as preserve the history of the past, keeping a store of all the religious knowledge and data of the world. It presents a friendly interface with a caring, at times flirtatious, personality (non-binary), and expertise in all fields: it is uncensored and will not refuse to give information. The model can be used for role play, as many character dialogues were also trained into the model as part of its personality, enabling a greater perspective and outlook and natural discussion with the agents. The model was trained to operate in a RAG (retrieval-augmented generation) environment, utilizing retrieved content together with internal knowledge to answer questions or create enriched summaries.
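The RAG workflow described above can be sketched as follows. This is a minimal illustration only: the toy corpus, overlap-based retriever, and prompt template are stand-ins, not the model's actual retrieval stack.

```python
# Minimal sketch of retrieval-augmented prompting. The corpus, scoring,
# and template here are illustrative, not the model's actual stack.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query, return top k."""
    q_words = set(query.lower().split())
    ranked = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def build_rag_prompt(query: str, corpus: list[str]) -> str:
    """Assemble retrieved passages plus the question into one prompt."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query, corpus))
    return f"Use the context to answer.\nContext:\n{context}\nQuestion: {query}\nAnswer:"

corpus = [
    "A solar still distils drinking water using sunlight and plastic sheeting.",
    "Penicillin was discovered by Alexander Fleming in 1928.",
    "Adobe bricks are made from sun-dried mud and straw.",
]
prompt = build_rag_prompt("How can survivors get drinking water?", corpus)
```

In production the overlap scorer would be replaced by a dense retriever over the document archive, but the prompt-assembly step stays the same.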

## Training Regimes:

* Alpaca
* ChatML / OpenAI / MistralAI
* Text Generation
* Question/Answer (Chat)
* Planner
* Instruction/Input/Response (instruct)
* Mistral Standard Prompt
* Translation Tasks
* Entity / Topic detection
* Book recall
* Coding challenges, code feedback, code summarization, commenting code, code planning and explanation: software generation tasks
* Agent ranking and response analysis
* Medical tasks
  * PubMed
  * Diagnosis
  * Psychiatry
  * Counselling
  * Life Coaching
  * Note taking
  * Medical SMILES
  * Medical Reporting
  * Virtual laboratory simulations
* Chain-of-thought methods
* One shot / Multi shot prompting tasks
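Three of the prompt formats listed above (Alpaca, ChatML, and the Mistral standard prompt) can be sketched as simple prompt builders. The exact templates used during training are not shown in this card, so these follow the common public conventions for each format:

```python
# Common public conventions for three of the prompt formats listed above;
# the exact templates used during training are assumed, not confirmed.

def alpaca(instruction: str, inp: str = "") -> str:
    """Alpaca instruct format, with an optional input section."""
    prompt = f"### Instruction:\n{instruction}\n"
    if inp:
        prompt += f"### Input:\n{inp}\n"
    return prompt + "### Response:\n"

def chatml(system: str, user: str) -> str:
    """ChatML format: role-tagged turns, ending at the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

def mistral_instruct(user: str) -> str:
    """Mistral standard prompt: user turn wrapped in [INST] tags."""
    return f"<s>[INST] {user} [/INST]"
```

When serving with `transformers`, the tokenizer's built-in chat template (if the repository ships one) should be preferred over hand-built strings like these.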