This version of the code was refactored by Vedant Mathur.
# TAR (TExt Analytics for Reconnaissance) Software Package Repository
Code that combines preprocessing, data collection, training, and inference to generate automated disaster reports.
## Key Files
* `tar_main.py` - Consolidates the relevant functions to produce a report
* `date2template*` - Files that perform different collection/processing of USGS data to be added to the briefings
* `classifiers.py` - Calls classifiers (regression, SVM, GAN, CNN) and runs a majority vote to determine the final classification for each sentence across 4 categories (buildings, infrastructure, resilience, other)
* `resilience_curve.py` - Generates resilience curves and calculates t0 and t1 (used to compute the recovery time for a disaster)
* `config.ini` - Parameters that control briefing generation
* `data` - Folder containing the log of earthquakes, tweets, and news articles
## Usage
**Generating a report**
To generate a report, run
```
python -m tear
```
This will iterate through earthquakes listed in the earthquake log and output a report to the "reports" directory.
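Conceptually, that entry point loops over the earthquake log and emits one report per event. Below is a minimal illustrative sketch of such a loop, not the actual `tar_main.py` code; the log path and the `id`/`time` column names are assumptions (the real schema lives in the `data` folder):

```python
import csv
import os

def generate_reports(log_path="data/earthquake_log.csv", out_dir="reports"):
    """Iterate over logged earthquakes and write one report file per event."""
    os.makedirs(out_dir, exist_ok=True)
    written = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            # One report per earthquake, named after its event id.
            report_path = os.path.join(out_dir, f"{row['id']}_report.txt")
            with open(report_path, "w") as out:
                out.write(f"Earthquake {row['id']} at {row['time']}\n")
            written.append(report_path)
    return written
```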
### Generating a resilience curve
To do this, call the `generateResilience` function in `resilience_curve.py`. It takes the following parameters:
* `ruptureTime` - Time at which the earthquake occurred (e.g. `2021-02-24 02:05:59`)
* `twitterFile` - CSV file containing tweets for the earthquake
* `keywords` - Keywords to filter the tweets by
For example:
```python
generateResilience("2021-02-24 02:05:59", "data/tweets/ArgentinaTweets.csv", ["electricity", "lights"])
```
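For intuition, here is a minimal sketch of how a resilience signal could be derived from the tweet data: filter tweets by the keywords, then bin the matches by hours elapsed since the rupture time. The function name, tweet schema (`created_at` and `text` fields), and hourly binning are all assumptions for illustration, not the actual `resilience_curve.py` implementation:

```python
from datetime import datetime

def resilience_curve_sketch(rupture_time, tweet_rows, keywords):
    """Count keyword-matching tweets per hour after the rupture.

    tweet_rows: iterable of dicts with 'created_at' and 'text' keys
    (assumed schema). Returns {hours_since_rupture: tweet_count}.
    """
    t0 = datetime.strptime(rupture_time, "%Y-%m-%d %H:%M:%S")
    counts = {}
    for row in tweet_rows:
        text = row["text"].lower()
        # Keep only tweets mentioning at least one tracked keyword.
        if not any(k in text for k in keywords):
            continue
        ts = datetime.strptime(row["created_at"], "%Y-%m-%d %H:%M:%S")
        hours = int((ts - t0).total_seconds() // 3600)
        if hours >= 0:  # ignore tweets from before the earthquake
            counts[hours] = counts.get(hours, 0) + 1
    return counts
```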