Consult the module API page at
https://engineering.purdue.edu/kak/distBabyGPT/babyGPT-1.0.8.html
for all information related to this module, including the latest changes to
the code. The page at the URL shown above lists all of the module
functionality you can invoke in your own code.
::
Creating an instance of babyGPT:

    baby_gpt = babyGPT(
        max_seq_length = max_seq_length,
        batch_size = batch_size,
        embedding_size = embedding_size,
        num_basic_decoders = num_basic_decoders,
        num_atten_heads = num_atten_heads,
        optimizer_params = optimizer_params,
        num_warmup_steps = num_warmup_steps,
        masking = masking,
        verify_text_corpus = False,
        path_saved_model = {"decoder" : "./saved_decoder",
                            "embedding_generator" : "./saved_embedding_generator",
                           },
    )
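The constructor arguments are ordinary Python values that you define before the call. The specific numbers below are illustrative assumptions only, as are the keys shown for `optimizer_params`; they are not taken from the babyGPT documentation, so consult the API page for settings appropriate to your corpus:

```python
# Illustrative configuration for the babyGPT constructor shown above.
# Every value here is an assumption chosen for demonstration purposes.
max_seq_length     = 512     # longest token sequence fed to the model
batch_size         = 4
embedding_size     = 256     # dimensionality of the token embeddings
num_basic_decoders = 4       # number of stacked decoder blocks
num_atten_heads    = 4       # attention heads per decoder block
num_warmup_steps   = 4000    # warmup steps for the LR schedule
masking            = True    # causal masking for autoregressive training

# The key names below are an assumed shape for the optimizer settings:
optimizer_params   = {"beta1": 0.9, "beta2": 0.98, "epsilon": 1e-9}
```

Note that the embedding size is chosen to be divisible by the number of attention heads, since multi-head attention splits the embedding evenly across the heads.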
Since babyGPT calls on TransformerFG for language modeling, you must also construct an instance of that class:
    xformer = baby_gpt.TransformerFG(
        max_seq_length = max_seq_length,
        embedding_size = embedding_size,
        tokenizer_json = tokenizer_json,
        num_warmup_steps = num_warmup_steps,
        optimizer_params = optimizer_params,
    )
Within the TransformerFG module, it is the MasterDecoderWithMasking class that carries out the next-token prediction needed for self-supervised learning:
    master_decoder = baby_gpt.MasterDecoderWithMasking(
        xformer,
        num_basic_decoders = num_basic_decoders,
        num_atten_heads = num_atten_heads,
        masking = masking,
    )
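The `masking` argument enables causal (autoregressive) masking in the decoder's self-attention, so that each position can attend only to itself and to earlier positions. As a generic sketch of the idea (not babyGPT's own implementation), a boolean causal mask can be built like this:

```python
def causal_mask(n):
    """Return an n x n boolean mask in which entry [i][j] is True
    exactly when position i may attend to position j, i.e. j <= i."""
    return [[j <= i for j in range(n)] for i in range(n)]

# For a 3-token sequence, each row shows what that position may see:
#   [[True, False, False],
#    [True, True,  False],
#    [True, True,  True ]]
```

During training, positions where the mask is False are typically set to negative infinity before the softmax, so they contribute nothing to the attention weights.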
Finally, here is how to create the dataloader you are going to need:
    dataloader = baby_gpt.ArticleDatasetWithBufferedContext(
        gpt = baby_gpt,
        tokenizer_json = tokenizer_json,
        context_window_size = context_window_size,
        context_buffer_size = context_buffer_size,
        articles_dir = articles_dir,
    )
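The class name suggests that when a long article is split into context windows, each new window is prefixed with a buffer of tokens carried over from the end of the previous window, so that context is not lost at window boundaries. That reading is an assumption on our part; as a hedged sketch of the idea, with a hypothetical helper that is not part of babyGPT:

```python
def windows_with_buffered_context(token_ids, window_size, buffer_size):
    """Split token_ids into windows of window_size tokens, where every
    window after the first re-includes the last buffer_size tokens of
    the previous window as carried-over context."""
    assert 0 <= buffer_size < window_size
    step = window_size - buffer_size      # fresh tokens added per window
    out = []
    for start in range(0, len(token_ids), step):
        out.append(token_ids[start:start + window_size])
        if start + window_size >= len(token_ids):
            break
    return out
```

For example, splitting ten tokens with `window_size=4` and `buffer_size=1` yields windows in which the last token of each window reappears as the first token of the next.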
Raw data
{
"_id": null,
"home_page": "https://engineering.purdue.edu/kak/distBabyGPT/babyGPT-1.0.8.html",
"name": "babyGPT",
"maintainer": "Avinash Kak",
"docs_url": null,
"requires_python": null,
"maintainer_email": "kak@purdue.edu",
"keywords": "language modeling, unsupervised learning",
"author": "Avinash Kak",
"author_email": "kak@purdue.edu",
"download_url": "https://files.pythonhosted.org/packages/48/14/bfd8b181f9ec6c8b9e950cd57d6ac69e936e2f8295b6684ea00e8967c37d/babyGPT-1.0.8.tar.gz",
"platform": "All platforms",
"bugtrack_url": null,
"license": "Python Software Foundation License",
"summary": "An educational module for experimenting with unsupervised learning in large language modeling",
"version": "1.0.8",
"project_urls": {
"Download": "https://engineering.purdue.edu/kak/distBabyGPT/babyGPT-1.0.8.tar.gz",
"Homepage": "https://engineering.purdue.edu/kak/distBabyGPT/babyGPT-1.0.8.html"
},
"split_keywords": [
"language modeling",
" unsupervised learning"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "4814bfd8b181f9ec6c8b9e950cd57d6ac69e936e2f8295b6684ea00e8967c37d",
"md5": "4567b750ffbcd6dfbcc11de4498ad0c7",
"sha256": "42bcee894281f58fd6a07b2033c0288a5889a0cc204f10d714d24f6a276279a3"
},
"downloads": -1,
"filename": "babyGPT-1.0.8.tar.gz",
"has_sig": false,
"md5_digest": "4567b750ffbcd6dfbcc11de4498ad0c7",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 683299,
"upload_time": "2025-08-01T03:12:02",
"upload_time_iso_8601": "2025-08-01T03:12:02.543070Z",
"url": "https://files.pythonhosted.org/packages/48/14/bfd8b181f9ec6c8b9e950cd57d6ac69e936e2f8295b6684ea00e8967c37d/babyGPT-1.0.8.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-01 03:12:02",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "babygpt"
}