PyPI Library: dl-assistant Documentation
========================================
## Table of Contents
- [Installation](#installation)
- [Usage - Image Classification](#usage---image-classification)
- [Examples](#examples)
- [API Reference](#api-reference)
## Installation
The package can be installed from the terminal with:
```bash
pip install dl-assistant
```
Usage - Image classification
----------------------------
1. **create\_dataframe** - This function takes a directory containing training or test data and converts it into a pandas DataFrame, the first step of the workflow; a sketch of the expected directory layout follows the usage example below.
#### Simple Usage
```python
from dl_assistant.image_cnn import classification

x = classification()
df = x.create_dataframe('train')  # DataFrame with image paths in one column and labels in another
# Continue on your own
```
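
For orientation, here is a minimal standalone sketch of the kind of DataFrame `create_dataframe` presumably builds, assuming the directory contains one subfolder per class. The helper name `build_image_dataframe` is hypothetical; the column names `Images`/`Labels` follow the examples in this document, and the library's real implementation may differ:

```python
import os

import pandas as pd

def build_image_dataframe(root_dir):
    """Collect (image path, label) rows, assuming one subdirectory per class."""
    rows = []
    for label in sorted(os.listdir(root_dir)):
        class_dir = os.path.join(root_dir, label)
        if not os.path.isdir(class_dir):
            continue
        for fname in sorted(os.listdir(class_dir)):
            rows.append({'Images': os.path.join(class_dir, fname), 'Labels': label})
    return pd.DataFrame(rows)

# df = build_image_dataframe('train')
# df.head()  # 'Images' column: file paths, 'Labels' column: class names
```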
2. **prep\_x\_train** - This function takes the image paths plus the target height and width, resizes each image, and returns an array of training features already divided by 255.0; a rough standalone equivalent is sketched after the usage example below.
#### Simple Usage
```python
from dl_assistant.image_cnn import classification

x = classification()
df = x.create_dataframe('train')  # DataFrame with image paths and labels
x_train = x.prep_x_train(df['Images'], 100, 100)  # features resized to 100x100 and divided by 255.0
```
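
To see what the preprocessing amounts to, here is a rough standalone equivalent written with Pillow and NumPy. This is an assumption about the internals; the real `prep_x_train` may differ in interpolation, channel handling, or ordering:

```python
import numpy as np
from PIL import Image

def load_and_scale(paths, height, width):
    """Resize each image and scale pixel values to [0, 1] by dividing by 255.0."""
    images = []
    for path in paths:
        img = Image.open(path).convert('RGB').resize((width, height))
        images.append(np.asarray(img, dtype=np.float32) / 255.0)
    return np.stack(images)

# x_train_manual = load_and_scale(df['Images'], 100, 100)  # shape: (n, 100, 100, 3)
```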
3. **prep\_y\_train** - This function takes the labels and the number of classes as input and converts the labels into a binary class matrix (one-hot encoding); an equivalent sketch follows the usage example below.
#### Simple Usage
```python
from dl_assistant.image_cnn import classification

x = classification()
df = x.create_dataframe('train')  # DataFrame with image paths and labels
x_train = x.prep_x_train(df['Images'], 100, 100)
y_train = x.prep_y_train(df['Labels'], 7)  # one-hot encoded labels for 7 classes
```
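
The conversion is effectively one-hot encoding. For reference, continuing from the example above, an equivalent using Keras's `to_categorical` might look like this (the mapping from string labels to integer indices is an assumption; the library's own ordering may differ):

```python
import numpy as np
from tensorflow.keras.utils import to_categorical

labels = df['Labels']
classes = sorted(labels.unique())                          # fix a class order
label_to_index = {name: i for i, name in enumerate(classes)}
indices = np.array([label_to_index[name] for name in labels])
y_train_manual = to_categorical(indices, num_classes=7)    # binary class matrix, shape (n, 7)
```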
4. **make\_model** - This function takes the number of units (filters) of the first Conv2D layer, the number of classes, and the input shape as input and returns a basic, untrained model; a note on inspecting the result follows the usage example below.
#### Simple Usage
```python
from dl_assistant.image_cnn import classification

x = classification()
df = x.create_dataframe('train')  # DataFrame with image paths and labels
x_train = x.prep_x_train(df['Images'], 100, 100)
y_train = x.prep_y_train(df['Labels'], 7)
shape = x_train[0].shape          # shape of a single image, e.g. (100, 100, 3)
model = x.make_model(128, 7, shape)
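```

Assuming the returned object is a standard Keras model (the package describes itself as automating Keras model development), it can be inspected before training to confirm the architecture and the output dimension; a brief sketch:

```python
# Inspect the generated (untrained) architecture and parameter counts.
model.summary()

# The final layer's output dimension should match the number of classes (7 here).
print(model.output_shape)  # expected: (None, 7)
```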
5. **expert\_make\_model** - This function is intended for expert users who want finer control over the architecture; a hedged sketch of how the layer list might expand into Keras layers follows the usage example below.
* **Syntax**: `expert_make_model(self, layers, unit, num_classes, input_shape, dr)`
* **layers**: a list of strings, each one of `'Conv2D'`, `'MaxPooling2D'`, or `'Dropout'`
* **unit**: an integer, the number of units (filters) of the first Conv2D layer
* **num\_classes**: the number of labels (classes)
* **input\_shape**: the input shape, i.e. `x_train[0].shape`
* **dr**: the dropout rate
#### Simple Usage
```python
from dl_assistant.image_cnn import classification

x = classification()
df = x.create_dataframe('train')  # DataFrame with image paths and labels
x_train = x.prep_x_train(df['Images'], 100, 100)
y_train = x.prep_y_train(df['Labels'], 7)
shape = x_train[0].shape
model = x.expert_make_model(['Conv2D', 'Conv2D', 'MaxPooling2D', 'Dropout'], 128, 7, shape, 0.5)
```
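
To make the `layers` list concrete, below is a hedged guess at how such a list might expand into a Keras `Sequential` stack. The kernel sizes, activations, pooling size, and compile settings are assumptions for illustration only, not the library's actual code:

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, Dense, Dropout, Flatten, MaxPooling2D

def sketch_expert_model(layers, unit, num_classes, input_shape, dr):
    """Hypothetical expansion of the string list into Keras layers."""
    model = Sequential()
    # First entry: a Conv2D layer with `unit` filters, anchored to the input shape.
    model.add(Conv2D(unit, (3, 3), activation='relu', input_shape=input_shape))
    for name in layers[1:]:
        if name == 'Conv2D':
            model.add(Conv2D(unit, (3, 3), activation='relu'))
        elif name == 'MaxPooling2D':
            model.add(MaxPooling2D((2, 2)))
        elif name == 'Dropout':
            model.add(Dropout(dr))
    model.add(Flatten())
    model.add(Dense(num_classes, activation='softmax'))
    model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
    return model

# sketch = sketch_expert_model(['Conv2D', 'Conv2D', 'MaxPooling2D', 'Dropout'], 128, 7, (100, 100, 3), 0.5)
```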
6. **train\_model** - This function takes the model to train, x\_train, y\_train, batch\_size, and epochs, and returns the trained model; a note on evaluating and saving the result follows the usage example below.
#### Simple Usage
```python
from dl_assistant.image_cnn import classification

x = classification()
df = x.create_dataframe('train')  # DataFrame with image paths and labels
x_train = x.prep_x_train(df['Images'], 100, 100)
y_train = x.prep_y_train(df['Labels'], 7)
shape = x_train[0].shape
model = x.make_model(128, 7, shape)

batch_size = 32
epochs = 70
model = x.train_model(model, x_train, y_train, batch_size, epochs)
```
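
Assuming the object returned by `train_model` is a compiled Keras model, the usual Keras calls apply for evaluating and saving it. A brief sketch; `x_test` and `y_test` here are hypothetical arrays prepared the same way as the training data:

```python
# Evaluate on held-out data prepared with prep_x_train / prep_y_train on a test DataFrame.
loss, accuracy = model.evaluate(x_test, y_test, verbose=0)
print(f"test accuracy: {accuracy:.3f}")

# Persist the trained model for later reuse.
model.save('emotion_cnn.h5')
```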
Examples
--------
Create an emotion-classification model:

```python
from dl_assistant.image_cnn import classification

model = classification()
model = model.create('TRAIN', 7)  # build and train a model from the 'TRAIN' directory with 7 classes
model.predict([x_test[0]])        # x_test: preprocessed test images (e.g. from prep_x_train)
```
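
Putting the pieces together, here is an end-to-end sketch built only from the calls documented above. The `train`/`test` directory names, the 100x100 image size, and the 7 classes are carried over from the earlier examples and may need to be adapted to your data:

```python
from dl_assistant.image_cnn import classification

clf = classification()

# Prepare training data
train_df = clf.create_dataframe('train')
x_train = clf.prep_x_train(train_df['Images'], 100, 100)
y_train = clf.prep_y_train(train_df['Labels'], 7)

# Build and train a basic CNN
model = clf.make_model(128, 7, x_train[0].shape)
model = clf.train_model(model, x_train, y_train, 32, 70)

# Prepare test data the same way and predict
test_df = clf.create_dataframe('test')
x_test = clf.prep_x_train(test_df['Images'], 100, 100)
predictions = model.predict(x_test)
```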
Raw data
--------

```json
{
"_id": null,
"home_page": "",
"name": "dl-assistant",
"maintainer": "",
"docs_url": null,
"requires_python": "",
"maintainer_email": "",
"keywords": "Automated Deep Learning,Model Generation,Neural Network Automation,Deep Learning Automation,AI Model Builder,Neural Architecture Search (NAS),Automated Model Design,Deep Learning Toolkit,Effortless Model Creation,Model Composer,AI Model Automation,Neurogeneration,DL Model Composer,Smart Model Builder,AI Model Generator,Ayush Agrawal,tensorflow,keras",
"author": "Ayush Agrawal",
"author_email": "aagrawal963@gmail.com",
"download_url": "https://files.pythonhosted.org/packages/e6/1d/a4401a59302d968d4245a76b9e872e31c7499b129dc9c10ef106d0cb5a78/dl-assistant-0.0.6.tar.gz",
"platform": null,
"description": "\r\nPyPi Library :- dl-assistant Documentation\r\n=======================================\r\n\r\n\r\n## Table of Contents\r\n- [Installation](#installation)\r\n- [Usage - Image Classification](#usage-img-class)\r\n- [Examples](#examples)\r\n- [API Reference](#api-reference)\r\n\r\n## Installation\r\nIt can be easily installed by typing the following in the terminal:\r\n\r\n```bash\r\npip install dl-assistant\r\n```\r\n\r\n\r\nUsage - Image classification\r\n----------------------------\r\n\r\n1. **create\\_dataframe**\\- This function takes dir containing training or test data and convert it into pandas dataframe that is 1st step towards deep learning \r\n \r\n #### Simple Usage\r\n \r\n \r\n from dl_assistant.image_cnn\r\n import classification\r\n x=classification()\r\n df=x.create_dataframe('train') # It will create a dataframe getting a datframe with image address in one column and label in another\r\n #Continue on your own\r\n \r\n \r\n2. **prep\\_x\\_train**\\- This function takes image paths, height and width to resize and returns an array of features of training data. It itself divide features by 255.0 \r\n \r\n #### Simple Usage\r\n \r\n \r\n from dl_assistant.image_cnn import classification\r\n x=classification()\r\n df=x.create_dataframe('train') # It will create a dataframe getting a datframe with image address in one column and label in another\r\n x_train = x.prep_x_train(df['Images',100,100])# get features divided by 255.0 \r\n \r\n \r\n3. **prep\\_y\\_train**\\- This function takes labels and number of classes as input and converts labels into binary metrics.\r\n \r\n #### Simple Usage\r\n \r\n \r\n from dl_assistant.image_cnn import classification\r\n x=classification()\r\n df=x.create_dataframe('train') # It will create a dataframe getting a datframe with image address in one column and label in another\r\n x_train = x.prep_x_train(df['Images',100,100])\r\n y_train = x.prep_y_train(df['Labels'],7)\r\n \r\n \r\n4. **make\\_model**\\- This function takes unit of 1 stconv2D layer and number of classes as input and returns a basic non-trained model.\r\n \r\n #### Simple Usage\r\n \r\n \r\n from dl_assistant.image_cnn import classification\r\n x=classification()\r\n df=x.create_dataframe('train') # It will create a dataframe getting a datframe with image address in one column and label in another\r\n x_train = x.prep_x_train(df['Images',100,100])\r\n y_train = x.prep_y_train(df['Labels'],7)\r\n shape=x_train[0].shape\r\n model = x.make_model(128,7,shape)\r\n \r\n \r\n5. **expert\\_make\\_model**\\- This is a function developed for expert.\r\n \r\n * **Syntax** \\=expert\\_make\\_model(self,layers,unit,num\\_classes,input\\_shape,dr)\r\n \r\n * **layers**\\= This is a string list that can contain either Conv2D,MaxPooling2D or Dropout\r\n * **unit**\\= This is the integer that contains the unit of 1st Conv2D layer\r\n * **num\\_classes**\\= This is the count of no. of labels\r\n * **input\\_shape**\\= This is the input shape or shape of x\\_train\\[0\\]\r\n * **dr**\\= The dropout rate\r\n \r\n #### Simple Usage\r\n \r\n \r\n from dl_assistant.image_cnn import classification\r\n x=classification()\r\n df=x.create_dataframe('train') # It will create a dataframe getting a datframe with image address in one column and label in another\r\n x_train = x.prep_x_train(df['Images',100,100])\r\n y_train = x.prep_y_train(df['Labels'],7)\r\n shape=x_train[0].shape\r\n model = x.expert_make_model(['Conv2D','Conv2D','MaxPooling2D','Dropout'],128,7,shape,0.5)\r\n \r\n \r\n6. 
**train\\_model**\\- This function takes model to train, x\\_train , y\\_train, epochs and batch\\_size and returns a trained model\r\n \r\n #### Simple Usage\r\n \r\n \r\n from dl_assistant.image_cnn import classification\r\n x=classification()\r\n df=x.create_dataframe('train') # It will create a dataframe getting a datframe with image address in one column and label in another\r\n x_train = x.prep_x_train(df['Images',100,100])\r\n y_train = x.prep_y_train(df['Labels'],7)\r\n shape=x_train[0].shape\r\n model = x.make_model(128,7,shape)\r\n batch_size=32\r\n epochs=70\r\n model.x.train_model(model,x_train,y_train,batch_size,epochs)\r\n \r\n \r\n\r\nExamples\r\n--------\r\n\r\nCreate a emotion-classification model\r\n\r\n `from dl_assistant.image_cnn import classification model = classification() model = model.create('TRAIN',7) model.predict([x_test[0]])`\r\n \r\n \r\n",
"bugtrack_url": null,
"license": "",
"summary": "A libirary to automate developing keras models",
"version": "0.0.6",
"project_urls": null,
"split_keywords": [
"automated deep learning",
"model generation",
"neural network automation",
"deep learning automation",
"ai model builder",
"neural architecture search (nas)",
"automated model design",
"deep learning toolkit",
"effortless model creation",
"model composer",
"ai model automation",
"neurogeneration",
"dl model composer",
"smart model builder",
"ai model generator",
"ayush agrawal",
"tensorflow",
"keras"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "371407b905afe4883c253d8dc6cdbc657b7b7bc86b31656aa337495ea534c4d1",
"md5": "46b987fd77e109e3709eeead8a99c621",
"sha256": "0a7a2659cb1bdce9abb1d5b2d8407dd47ba621c50f3c4d9005af95f9700f4605"
},
"downloads": -1,
"filename": "dl_assistant-0.0.6-py3-none-any.whl",
"has_sig": false,
"md5_digest": "46b987fd77e109e3709eeead8a99c621",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 4978,
"upload_time": "2024-01-23T12:41:06",
"upload_time_iso_8601": "2024-01-23T12:41:06.363523Z",
"url": "https://files.pythonhosted.org/packages/37/14/07b905afe4883c253d8dc6cdbc657b7b7bc86b31656aa337495ea534c4d1/dl_assistant-0.0.6-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "e61da4401a59302d968d4245a76b9e872e31c7499b129dc9c10ef106d0cb5a78",
"md5": "d9e96edc7815815aeb6cca8ca2e16aee",
"sha256": "e6a4b65835dbd32644c256eca73bfb9548af2d833a828b60677c7aa9acb431f2"
},
"downloads": -1,
"filename": "dl-assistant-0.0.6.tar.gz",
"has_sig": false,
"md5_digest": "d9e96edc7815815aeb6cca8ca2e16aee",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 4813,
"upload_time": "2024-01-23T12:41:08",
"upload_time_iso_8601": "2024-01-23T12:41:08.228120Z",
"url": "https://files.pythonhosted.org/packages/e6/1d/a4401a59302d968d4245a76b9e872e31c7499b129dc9c10ef106d0cb5a78/dl-assistant-0.0.6.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-01-23 12:41:08",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "dl-assistant"
}
```