# MatrixTransformer
A unified Python framework for structure-preserving matrix transformations in high-dimensional decision space.
> 📘 Based on the paper: **MatrixTransformer: A Unified Framework for Matrix Transformations**
> 🔗 [Read the full paper on Zenodo](https://zenodo.org/records/15867279)
> 🧠 [Related project: QuantumAccel](https://github.com/fikayoAy/quantum_accel)
---
## 🧩 Overview
**MatrixTransformer** introduces a novel method for navigating between 16 matrix types (e.g., symmetric, Toeplitz, Hermitian, sparse) in a continuous, mathematically coherent space using a 16-dimensional decision hypercube.
🔹 Perform structure-preserving transformations
🔹 Quantify information-structure trade-offs
🔹 Interpolate between matrix types
🔹 Extendable with custom matrix definitions
🔹 Applications in ML, signal processing, quantum simulation, and more
## 📦 Installation
### Requirements
⚠️ Requires Python 3.8+ with NumPy and SciPy installed; PyTorch is optional.
### Clone from GitHub and install from the wheel file
```bash
git clone https://github.com/fikayoAy/MatrixTransformer.git
cd MatrixTransformer
pip install dist/matrixtransformer-0.1.0-py3-none-any.whl
```
### Install dependencies
```bash
pip install numpy scipy torch
```
### Verify installation
```python
import MatrixTransformer
print("MatrixTransformer installed successfully!")
```
---
## 🔧 Basic Usage
### Initialize the transformer
```python
import numpy as np
from MatrixTransformer import MatrixTransformer
# Create a transformer instance
transformer = MatrixTransformer()
```
### Transform a matrix to a specific type
```python
# Create a sample matrix
matrix = np.random.randn(4, 4)
# Transform to symmetric matrix
symmetric_matrix = transformer.process_rectangular_matrix(matrix, 'symmetric')
# Transform to positive definite
positive_def = transformer.process_rectangular_matrix(matrix, 'positive_definite')
```
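Independently of how the library enforces these structures, the returned matrices can be sanity-checked with plain NumPy (no additional MatrixTransformer API is assumed here):

```python
# Pure-NumPy sanity checks on the returned matrices (no extra API assumed)
print(np.allclose(symmetric_matrix, symmetric_matrix.T))        # True if the result is symmetric
eigenvalues = np.linalg.eigvalsh((positive_def + positive_def.T) / 2)
print(np.all(eigenvalues > 0))                                  # True if the result is positive definite
```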
### Convert between tensors and matrices
```python
# Convert a 3D tensor to a 2D matrix representation
tensor = np.random.randn(3, 4, 5)
matrix_2d, metadata = transformer.tensor_to_matrix(tensor)
# Convert back to the original tensor
reconstructed_tensor = transformer.matrix_to_tensor(matrix_2d, metadata)
```
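Assuming the conversion is lossless, which the paired API suggests but is not guaranteed here, the round trip can be verified directly:

```python
# Check the round trip with NumPy only (assumes a lossless conversion)
print(reconstructed_tensor.shape)                 # expected: (3, 4, 5)
print(np.allclose(tensor, reconstructed_tensor))  # expected: True if nothing was lost
```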
### Combine matrices
```python
# Combine two matrices using different strategies
matrix1 = np.random.randn(3, 3)
matrix2 = np.random.randn(3, 3)
# Weighted combination
combined = transformer.combine_matrices(
    matrix1, matrix2, mode='weighted', weight1=0.6, weight2=0.4
)
# Other combination modes
max_combined = transformer.combine_matrices(matrix1, matrix2, mode='max')
multiply_combined = transformer.combine_matrices(matrix1, matrix2, mode='multiply')
```
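For reference, the `'weighted'` mode presumably corresponds to the element-wise convex combination `weight1 * matrix1 + weight2 * matrix2`; the comparison below is only a sketch under that assumption and may fail if the library normalizes or post-processes the result:

```python
# Reference combination under the assumed semantics of mode='weighted';
# this may differ if the library rescales or post-processes the output.
reference = 0.6 * matrix1 + 0.4 * matrix2
print(np.allclose(combined, reference))
```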
### Add custom matrix types
```python
def custom_magic_matrix_rule(matrix):
    """Transform a matrix to have 'magic square' properties."""
    n = matrix.shape[0]
    result = matrix.copy()
    target_sum = n * (n**2 + 1) / 2

    # Simplified implementation for demonstration
    # (For a real implementation, you would need proper balancing logic)
    row_sums = result.sum(axis=1)
    for i in range(n):
        result[i, :] *= (target_sum / max(row_sums[i], 1e-10))

    return result

# Add the new transformation rule
transformer.add_transform(
    matrix_type="magic_square",
    transform_rule=custom_magic_matrix_rule,
    properties={"equal_row_col_sums": True},
    neighbors=["diagonal", "symmetric"]
)
# Now use your custom transformation
magic_matrix = transformer.process_rectangular_matrix(matrix, 'magic_square')
```
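The demonstration rule can also be exercised on its own. With a strictly positive input, each row sum should land on the magic constant n(n² + 1)/2 (this simplified rule does not balance column sums):

```python
# Exercise the demonstration rule directly, independent of the transformer.
# A strictly positive input keeps row sums away from the max(..., 1e-10) guard.
test_matrix = np.random.rand(4, 4) + 0.1
magic_like = custom_magic_matrix_rule(test_matrix)
print(magic_like.sum(axis=1))   # each entry should be close to 4 * (4**2 + 1) / 2 = 34.0
```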
---
## 🎯 Advanced Features
### Hypercube decision space navigation
```python
# Find optimal transformation path between matrix types
source_type = transformer._detect_matrix_type(matrix1)
target_type = 'positive_definite'
path, attention_scores = transformer._traverse_graph(matrix1, source_type=source_type)
# Apply path-based transformation
result = matrix1.copy()
for matrix_type in path:
    transform_method = transformer._get_transform_method(matrix_type)
    if transform_method:
        result = transform_method(result)
```
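Assuming `_detect_matrix_type` returns the same kind of type string used above, it can be re-run on the result to confirm where the traversal ended up:

```python
# Re-detect the structure after walking the path
# (assumes _detect_matrix_type returns a type string, as used above)
final_type = transformer._detect_matrix_type(result)
print(f"Traversal path: {path}")
print(f"Detected type after traversal: {final_type}")
```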
### Hyperdimensional attention
```python
# Apply hyperdimensional attention for more robust transformations
query = np.random.randn(4, 4)
keys = [np.random.randn(4, 4) for _ in range(3)]
values = [np.random.randn(4, 4) for _ in range(3)]
result = transformer.hyperdimensional_attention(query, keys, values)
```
### AI Hypersphere Container
```python
# Create a hyperdimensional container for an AI entity
ai_entity = {"name": "Matrix Explorer", "capabilities": ["transform", "analyze"]}
container = transformer.create_ai_hypersphere_container(
    ai_entity,
    dimension=8,
    base_radius=1.0
)
# Extract matrix from container
matrix = container['extract_matrix']()
# Update container state
container['update_state'](np.random.randn(8))
# Process temporal evolution of container
container['process_temporal_state']()
```
### Blended Matrix Construction
```python
# Create a blended matrix from multiple source matrices
matrix_indices = [0, 1, 2] # Indices of matrices to blend
blend_weights = [0.5, 0.3, 0.2] # Weights for blending
blended_matrix = transformer.blended_matrix_construction(
    source_matrices=matrix_indices,
    blend_weights=blend_weights,
    target_type='symmetric',
    preserve_properties=['energy'],
    evolution_strength=0.1
)
```
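Because the requested target type is `'symmetric'`, the blended result can be checked the same way as earlier, again with NumPy only:

```python
# NumPy-only check of the requested 'symmetric' target structure;
# this assumes blended_matrix comes back as a square ndarray.
blended = np.asarray(blended_matrix)
print(np.allclose(blended, blended.T))
```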
---
## 🔁 Related Projects
- [QuantumAccel](https://github.com/fikayoAy/quantum_accel): A quantum-inspired system built on MatrixTransformer's transformation logic, modeling coherence, flow dynamics, and structure-evolving computations.
---
## 🧠 Citation
If you use this library in your work, please cite the paper:
```bibtex
@misc{ayodele2025matrixtransformer,
  title={MatrixTransformer: A Unified Framework for Matrix Transformations},
  author={Ayodele, Fikayomi},
  year={2025},
  doi={10.5281/zenodo.15867279},
  url={https://zenodo.org/records/15867279}
}
```
---
## 📩 Contact
Questions, suggestions, or collaboration ideas?
Open an issue or reach out via Ayodeleanjola4@gmail.com or 2273640@swansea.ac.uk.