Name: foc
Version: 0.4.6
Summary: A collection of python functions for somebody's sanity
Home page: https://github.com/thyeem/foc
Author: Francis Lim
Requires: Python >=3.6
License: MIT
Keywords: functional, functools, functional-python
Upload time: 2024-01-16 06:26:40
# foc

![foc](https://img.shields.io/pypi/v/foc)

`fun oriented code` or `francis' odd collection`.


Functions from the `Python` standard library are great. But some notations are a bit painful and confusing for personal use, so I created this _odd collection of functions_.


## Tl;dr

- `foc` provides a collection of ___higher-order functions___ and some helpful (_pure_) _functions_.
- `foc` provides an easy way to ___compose functions with symbols___ (`.` and `|`).
- `foc` respects the `python` standard library. _It never reinvents the wheel_.


## How to use
```bash
# install
$ pip install -U foc

# import
>>> from foc import *
```

## Ground rules
- _No dependencies_ other than the `python` standard library
- _No unnecessary wrapping_ of objects.
- Most function implementations _should be less than 5 lines_.
- Follows `haskell`-like function names and argument order
- Uses `python` generators first when possible (_lazy evaluation_)
  > `map`, `filter`, `zip`, `range`, `flat` ...

- Provides functions that unpack generators into a `list` as well.
- Function names that end in `l` indicate that the result will be unpacked into a list (see the short example after this list).
  > `mapl`, `filterl`, `zipl`, `rangel`, `flatl`, `takewhilel`, `dropwhilel`, ...
- Function names that end in `_` indicate that the function is a _partial application_ builder.
  > `cf_`, `f_`, `ff_`, `c_`, `cc_`, `u_`, ...
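
For example, a minimal sketch of the two naming conventions above, using only functions documented later in this README:
```python
>>> mapl(abs, range(-2, 3))    # 'mapl' = 'map', but the result is unpacked into a list
[2, 1, 0, 1, 2]

>>> f_("+", 5)(2)              # 'f_' builds a partial application: (5+) applied to 2
7
```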

## Quickstart

Let's pick lottery numbers, that is, 6 numbers from 1 to 45. _The Lotto_ [_(korean lottery)_](https://en.lottolyzer.com/home/south-korea/6_slash_45-lotto)
```python
>>> range(1, 46) | choice(size=6) | sort
[2, 8, 22, 24, 37, 39]

# This is one game. People usually buy five games at once.

>>> [ range(1, 46) | choice(size=6) | sort for _ in range(5) ]
[[4, 6, 11, 38, 41, 45],
 [5, 8, 23, 25, 26, 40],
 [13, 18, 23, 25, 37, 44],
 [17, 21, 24, 32, 41, 43],
 [5, 9, 13, 25, 30, 38]]

>>> gumballs = replicate(5, range(1, 46))     # 5-set of gumballs

# in Unix pipelines manner
>>> gumballs | map(choice(size=6)) | map(sort) | collect
[[1, 3, 5, 10, 23, 41],
 [14, 18, 28, 33, 37, 39],
 [13, 15, 19, 23, 32, 45],
 [4, 11, 19, 27, 30, 39],
 [8, 33, 35, 39, 40, 41]]

# with Haskell-like mathematical symbol
>>> (collect . map(sort . fx(choice(size=6))))(gumballs)
[[4, 14, 15, 28, 42, 44],
 [12, 34, 37, 40, 41, 42],
 [7, 10, 21, 26, 31, 39],
 [6, 11, 12, 14, 25, 32],
 [2, 13, 15, 26, 27, 41]]
```

The functions `foc` provides are not particularly special. In fact, they look like ordinary functions.  
```python
>>> id("francis")
'francis'

>>> even(3)
False

>>> take(3, range(5, 10))
[5, 6, 7]
```

`foc` just adds ways to __compose functions with symbols__
| symbol          | description                     | evaluation order | Available functions            |
|-----------------|---------------------------------|------------------|--------------------------------|
| `.` (dot)       | same as the mathematical symbol | backwards        | all _globals_, all _built-ins_ |
| `\|` (pipeline) | in Unix pipeline manner         | in order         | `@fx`-_decorated functions_    |
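
For example, the same pipeline written in both notations (`.` reads right to left, `|` reads left to right):
```python
>>> (collect . map(succ) . range)(5)    # apply 'range' first, then 'map(succ)', then 'collect'
[1, 2, 3, 4, 5]

>>> range(5) | map(succ) | collect      # the same, in Unix pipeline order
[1, 2, 3, 4, 5]
```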

### Composition of Functions with `.`
```python
>>> (length . range)(10)
10

>>> (collect . filter(even) . range)(10)  # 'collect' unpacks generators
[0, 2, 4, 6, 8]

>>> (sum . map(f_("+", 5)) . range)(10)   # f_("+", 5) == lambda x: x+5
95

>>> (last . sort . shuffle . collect . range)(11)
10
```

All functions in `globals()`, including all `built-ins`, can be directly composed with `.`, except for __two__ cases:


- `lambda` 
- _partial application_ like: `partial(map, lambda x: x+5)`
  > the same as `map(f_("+", 5))`. They are interchangeable.   

In those cases, __just wrap them in `fx`__.
```python
>>> (fx(lambda x: x+2) . fx(lambda x: x*8))(5)    # don't. fx(lambda *args, **kwargs: ...)
42
>>> (id . f_("+", 2) . f_("*", 8))(5)             # isn't it better?
42

>>> (sum . fx(partial(map, lambda x: x+5)))(range(5))   # don't partial(map, lambda ...)
37
>>> (sum . map(f_("+", 5)))(range(5))                   # `map(f_("+", 5))` is enough
37

>>> (unchars . map(chr))(range(73, 82))
'IJKLMNOPQ'

>>> (collect . map(pred . succ) . range)(5)
[0, 1, 2, 3, 4]
```
But wrapping a partial `map` in `fx` every time is tedious. Thus, `f_`, `ff_`, `curry`, `uncurry`, `map`, and `filter` have been prepared __so that they can be used without `fx`__.
> _See also_: `f_`, `ff_`, `c_`, `cc_`, `u_`, `curry`, `uncurry`, `map`, and `filter`

### Composition of Functions with `|`
```python
>>> range(10) | length
10

>>> range(10) | filter(even) | collect    # 'collect' unpacks generators
[0, 2, 4, 6, 8]

>>> range(10) | map(f_("+", 5)) | sum     # f_("+", 5) == lambda x: x+5
95

>>> rangel(11) | shuffle | sort | last
10
```

Unlike `.`, composing functions with `|` is allowed only for _composable_ functions (that is, `fx` functions). But don't worry: most functions `foc` provides are already `fx` functions.

`fx` functions (or _Function eXtension_) are `@fx`-decorated functions.
> To list/get all available `fx` functions, call `catalog(fx=True)` or `lsfx()`.


If you want to make a function the `fx` function on the fly, __just wrap the function in `fx`__. 
```python
>>> 7 | fx(lambda x: x * 6)
42
```

Try binding a function to a new reference:
```python
>>> foo = fx(func)
```

or use the `fx` decorator. It's all the same.
```python
>>> @fx               # from now on, arg | func == func(arg)
... def func(arg):     
...    ...
```
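
For instance, a minimal sketch (the function name `double` is just for illustration):
```python
>>> @fx
... def double(x):
...     return x * 2

>>> 21 | double
42

>>> range(3) | map(double) | collect
[0, 2, 4]
```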


## Examples
These are some of the _symbol-composable_ functions `foc` provides.  
> To list all available functions, call `catalog()`.

### Basic (pure) functions 
```python
>>> id("francis")
'francis'

>>> const(5, "no-matter-what-comes-here")
5

>>> seq("only-returns-the-following-arg", 5)
5

>>> void(randbytes(256))

>>> fst(["sofia", "maria", "claire"])
'sofia'

>>> snd(("sofia", "maria", "claire"))
'maria'

>>> nth(3, ["sofia", "maria", "claire"]) 
'claire'

>>> take(3, range(5, 10))
[5, 6, 7]

>>> drop(3, "github") | collect 
['h', 'u', 'b']

>>> head(range(1,5))              # returns 'None' when []
1

>>> last(range(1,5))              # returns 'None' when []
4

>>> init(range(1,5)) | collect    # returns [] when []
[1, 2, 3]

>>> tail(range(1,5)) | collect    # returns [] when []
[2, 3, 4]

>>> pair("sofia", "maria")
('sofia', 'maria')

>>> pred(3)
2

>>> succ(3)
4

>>> odd(3)
True

>>> even(3)
False

>>> null([]) == null(()) == null({}) == null("")
True

>>> elem(5, range(10))
True

>>> not_elem("fun", "functions")
False

>>> nub("3333-13-1111111")
['3', '-', '1']

>>> chars("sofimarie")
['s', 'o', 'f', 'i', 'm', 'a', 'r', 'i', 'e']

>>> unchars(['s', 'o', 'f', 'i', 'm', 'a', 'r', 'i', 'e'])
'sofimarie'

>>> words("fun on functions")
['fun', 'on', 'functions']

>>> unwords(['fun', 'on', 'functions'])
'fun on functions'

>>> lines("fun\non\nfunctions")
['fun', 'on', 'functions']

>>> unlines(['fun', 'on', 'functions'])
"fun\non\nfunctions"

>>> take(3, repeat(5))        # repeat(5) = [5, 5, ...]
[5, 5, 5]

>>> take(5, cycle("fun"))     # cycle("fun") = ['f', 'u', 'n', 'f', 'u', 'n', ...]
['f', 'u', 'n', 'f', 'u']

>>> replicate(3, 5)           # the same as 'take(3, repeat(5))'
[5, 5, 5]

>>> take(3, count(2))         # count(2) = [2, 3, 4, 5, ...]
[2, 3, 4]

>>> take(3, count(2, 3))      # count(2, 3) = [2, 5, 8, 11, ...]
[2, 5, 8]
```
### Higher-order functions
```python
>>> flip(pow)(7, 3)                             # the same as `pow(3, 7) = 3 ** 7`
2187

>>> bimap(f_("+", 3), f_("*", 7), (5, 7))       # bimap (3+) (7*) (5, 7)
(8, 49)                                         # (3+5, 7*7)

>>> first(f_("+", 3), (5, 7))                   # first (3+) (5, 7)
(8, 7)                                          # (3+5, 7)

>>> second(f_("*", 7), (5, 7))                  # second (7*) (5, 7)
(5, 49)                                         # (5, 7*7)

>>> take(5, iterate(lambda x: x**2, 2))         # [2, 2**2, (2**2)**2, ((2**2)**2)**2, ...]
[2, 4, 16, 256, 65536]

>>> [* takewhile(even, [2, 4, 6, 1, 3, 5]) ]    
[2, 4, 6]

>>> takewhilel(even, [2, 4, 6, 1, 3, 5])
[2, 4, 6]

>>> [* dropwhile(even, [2, 4, 6, 1, 3, 5]) ]    
[1, 3, 5]

>>> dropwhilel(even, [2, 4, 6, 1, 3, 5])
[1, 3, 5]

# fold with a given initial value from the left
>>> foldl("-", 10, range(1, 5))                 # foldl (-) 10 [1..4]
0

# fold with a given initial value from the right
>>> foldr("-", 10, range(1, 5))                 # foldr (-) 10 [1..4]
8

# `foldl` without an initial value (used first item instead)
>>> foldl1("-", range(1, 5))                    # foldl1 (-) [1..4]
-8

# `foldr` without an initial value (used first item instead)
>>> foldr1("-", range(1, 5))                    # foldr1 (-) [1..4]
-2

# accumulate reduced values from the left
>>> scanl("-", 10, range(1, 5))                 # scanl (-) 10 [1..4]
[10, 9, 7, 4, 0]

# accumulate reduced values from the right
>>> scanr("-", 10, range(1, 5))                 # scanr (-) 10 [1..4]
[8, -7, 9, -6, 10]

# `scanl` but no starting value
>>> scanl1("-", range(1, 5))                    # scanl1 (-) [1..4]
[1, -1, -4, -8]

# `scanr` but no starting value
>>> scanr1("-", range(1, 5))                    # scanr1 (-) [1..4]
[-2, 3, -1, 4]

>>> concatl(["sofia", "maria"])
['s', 'o', 'f', 'i', 'a', 'm', 'a', 'r', 'i', 'a']
# Note that ["sofia", "maria"] = [['s','o','f','i','a'], ['m','a','r','i','a']]

>>> concatmapl(str.upper, ["sofia", "maria"])   
['S', 'O', 'F', 'I', 'A', 'M', 'A', 'R', 'I', 'A']
```


### Real-World Example
A causal self-attention block of a `transformer` model based on `pytorch` can be described as follows.  
_Somebody_ insists that this helps you follow the process flow without distraction.

```python
    def forward(self, x):
        B, S, E = x.size()  # size_batch, size_block (sequence length), size_embed
        N, H = self.config.num_heads, E // self.config.num_heads  # E == (N * H)

        q, k, v = self.c_attn(x).split(self.config.size_embed, dim=2)
        q = q.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
        k = k.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)
        v = v.view(B, S, N, H).transpose(1, 2)  # (B, N, S, H)

        # Attention(Q, K, V)
        #   = softmax( Q*K^T / sqrt(d_k) ) * V
        #         // q*k^T: (B, N, S, H) x (B, N, H, S) -> (B, N, S, S)
        #   = attention-prob-matrix * V
        #         // prob @ v: (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
        #   = attention-weighted value (attention score)

        return cf_(
            self.dropout,  # dropout of layer's output
            self.c_proj,  # linear projection
            ff_(torch.Tensor.view, *rev(B, S, E)),  # (B, S, N, H) -> (B, S, E)
            torch.Tensor.contiguous,  # contiguous in-memory tensor
            ff_(torch.transpose, *rev(1, 2)),  # (B, S, N, H)
            ff_(torch.matmul, v),  # (B, N, S, S) x (B, N, S, H) -> (B, N, S, H)
            self.dropout_attn,  # attention dropout
            ff_(torch.masked_fill, *rev(mask == 0, 0.0)),  # double-check masking
            f_(F.softmax, dim=-1),  # softmax
            ff_(torch.masked_fill, *rev(mask == 0, float("-inf"))),  # no-look-ahead
            ff_("/", math.sqrt(k.size(-1))),  # / sqrt(d_k)
            ff_(torch.matmul, k.transpose(-2, -1)),  # Q @ K^T -> (B, N, S, S)
        )(q)
```

## In Detail
### Get binary functions from `python` operators: `sym`
`sym(OP)` converts `python`'s _symbolic operators_ into _binary functions_.  
The string forms of operators like `+`, `-`, `/`, `*`, `**`, `==`, `!=`, .. represent the corresponding binary functions.
> To list all available symbol operators, call `sym()`.

```python
>>> sym("+")(5, 2)                 # 5 + 2
7

>>> sym("==")("sofia", "maria")    # "sofia" == "maria"
False

>>> sym("%")(123456, 83)           # 123456 % 83
35
```

### Build partial application: `f_` and `ff_`
- `f_` builds left-associative partial application,  
where the given function's arguments are partially applied _from the left_.
- `ff_` builds right-associative partial application,  
where the given function's arguments are partially applied _from the right_.

> `f_(fn, *args, **kwargs)`  
> `ff_(fn, *args, **kwargs) == f_(flip(fn), *args, **kwargs)`  

```python
>>> f_("+", 5)(2)    # the same as `(5+) 2` in Haskell
7                    # 5 + 2

>>> ff_("+", 5)(2)   # the same as `(+5) 2 in Haskell`
7                    # 2 + 5

>>> f_("-", 5)(2)    # the same as `(5-) 2`
3                    # 5 - 2

>>> ff_("-", 5)(2)   # the same as `(subtract 5) 2`
-3                   # 2 - 5
```

### Build curried functions: `c_` (`curry`) and `cc_`
- `c_` is an alias for `curry`
- `c_` takes the function's arguments _from the left_ 
- while `cc_` takes them _from the right_.

> `c_(fn) == curry(fn)`   
> `cc_(fn) == c_(flip(fn))`

See also `uncurry`

```python
# currying from the left args
>>> c_("+")(5)(2)    # 5 + 2
7

>>> c_("-")(5)(2)    # 5 - 2
3

# currying from the right args
>>> cc_("+")(5)(2)   # 2 + 5
7

>>> cc_("-")(5)(2)   # 2 - 5
-3
```

### Build unary functions on a tuple: `u_` (`uncurry`)
- `u_` is an alias for `uncurry`
- `u_` converts a _normal function_ into __a unary function that takes a single tuple of arguments__
- `uncurry :: (a -> ... -> b -> o) -> (a, ..., b) -> o`

```python
>>> uncurry(pow)((2, 10))    # pow(2, 10)
1024

>>> (2, 3) | uncurry("+")    # 2 + 3 or (+) 2 3 
5

>>> ([1, 3], [2, 4]) | uncurry(zip) | collect    # collect(zip([1, 3], [2, 4]))
[(1, 2), (3, 4)]

>>> (collect . uncurry(zip))(([1,3], [2,4],))     # the same
[(1, 2), (3, 4)]
```




### Build composition of functions: `cf_` and `cfd`
- `cf_` (_composition of functions_) composes the given list of functions into one. 
- `cfd` (_composing-function decorator_) decorates a function with the given list of functions.

> `cf_(*fn, rep=None)`  
> `cfd(*fn, rep=None)`

```python
>>> square = ff_("**", 2)        # the same as (^2) in Haskell
>>> add5 = ff_("+", 5)           # the same as (+5) in Haskell
>>> mul7 = ff_("*", 7)           # the same as (*7) in Haskell

>>> cf_(mul7, add5, square)(3)   # (*7) . (+5) . (^2) $ 3
98                               # mul7(add5(square(3))) = ((3 ^ 2) + 5) * 7

>>> cf_(square, rep=3)(2)        # cf_(square, square, square)(2) == ((2 ^ 2) ^ 2) ^ 2 = 256
256


>>> @cfd(mul7, add5, square)
... def foo(x):
...    return len(x)

>>> foo([1,2,3])
98

# compare `cf_` with `cfd`
cf_(a, b, c, d, f)(x)    # (a . b . c . d . f)(x) 

cfd(a, b, c, d)(f)(x)    # (a . b . c . d)(f(x))
```

`cfd` is very handy for recreating previously defined functions by composition. All you need to do is write basic functions that do fundamental things.
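
For example, a small hypothetical sketch that derives a new function from the documented `words` with `cfd` (the name `wordcount` is just for illustration):
```python
>>> wordcount = cfd(len)(words)    # the same as lambda s: len(words(s))
>>> wordcount("fun on functions")
3
```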

### Seamlessly extends: `map`, `filter` and `zip`
- Extends usability while _maintaining full compatibility_
- _No harm_ to existing usage. Just __added ways to compose functions with symbols__

> `map(fn, *xs)`   
> `mapl(fn, *xs)`  
```python
>>> (collect . map(abs))(range(-2, 3)) 
[2, 1, 0, 1, 2]
>>> map(abs)(range(-2, 3)) | collect
[2, 1, 0, 1, 2]

>>> (collect . map(lambda x: x*8))(range(1, 6))
[8, 16, 24, 32, 40]
>>> range(1, 6) | map(lambda x: x*8) | collect
[8, 16, 24, 32, 40]

>>> (collect . map("*", [1, 2, 3]))([4, 5, 6])
[4, 10, 18]
>>> [4, 5, 6] | map("*", [1, 2, 3]) | collect
[4, 10, 18]
```

> `filter(p, xs)`  
> `filterl(p, xs)`
```python
>>> (collect . filter(f_("==", "f")))("fun-on-functions")
['f', 'f']
>>> filter(f_("==", "f"))("fun-on-functions") | collect
['f', 'f']

>>> primes = [2, 3, 5, 7, 11, 13, 17, 19]
>>> (collect . filter(lambda x: x % 3 == 2))(primes)
[2, 5, 11, 17]
>>> primes | filter(cf_(ff_("==", 2), ff_("%", 3))) | collect
[2, 5, 11, 17]
```

> `zip(*xs, strict=False)`  
> `zipl(*xs, strict=False)`  

```python
>>> (collect . f_(zip, "LOVE") . range)(3)
[('L', 0), ('O', 1), ('V', 2)]
>>> zip("LOVE", range(3)) | collect
[('L', 0), ('O', 1), ('V', 2)]

>>> (collect . uncurry(zip))(("LOVE", range(3),))
[('L', 0), ('O', 1), ('V', 2)]
>>> ("LOVE", range(3)) | uncurry(zip) | collect
[('L', 0), ('O', 1), ('V', 2)]
```

### Lazy Evaluation: `lazy` and `force`
- `lazy` defers the evaluation of a function (or expression) and returns the _deferred expression_.
- `force` forces the deferred expression to be fully evaluated when needed.
  > it is reminiscent of `Haskell`'s `force x = deepseq x x`.

> `lazy(fn, *args, **kwargs)`  
> `force(EXPR)`  
> `mforce([EXPR])`  

```python
# strictly generate a random integer between [1, 10)
>>> randint(1, 10)

# generate a lazy expression for the above
>>> deferred = lazy(randint, 1, 10)

# evaluate it when needed
>>> force(deferred)

# the same as above
>>> deferred()
```

Are those evaluations with `lazy` really deferred?

```python
>>> long_list = randint(1, 100000, 100000)    # a list of 100,000 random integers

>>> %timeit sort(long_list)
142 ms ± 245 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)

# See the evaluation was deferred
>>> %timeit lazy(sort, long_list)
1.03 µs ± 2.68 ns per loop (mean ± std. dev. of 7 runs, 1,000,000 loops each)
```

#### when to use
Given a function `randint(low, high)`, how can we generate a list of random integers?

```python
[ randint(1, 10) for _ in range(5) ]    # exactly the same as 'randint(1, 10, 5)'
```

That's the simplest way, but what about using `replicate`?
```python
# generate a list of random integers using 'replicate'
>>> replicate(5, randint(1, 10))
[7, 7, 7, 7, 7]        # ouch, duplication of the first evaluated item.
```
Wrong! This result is definitely not what we want. We need to defer the function evaluation till it is _replicated_.

Just use `lazy(randint, 1, 10)` instead of `randint(1, 10)`

```python
# replicate 'deferred expression'
>>> randos = replicate(5, lazy(randint, 1, 10))

# evaluate when needed
>>> mforce(randos)      # mforce = map(force), map 'force' over deferred expressions
[6, 2, 5, 1, 9]         # exactly what we wanted
```

Here is the simple secret: if you supply `f_` or `ff_` with a function and its arguments but leave it unevaluated (not called), it will act as a _deferred expression_.
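
For example, a sketch of the same trick with `f_`, assuming `mforce` accepts these deferred expressions just as it does `lazy` ones (the output values are illustrative and will vary):
```python
# 'f_(randint, 1, 10)' is never called here, so it acts as a deferred expression
>>> randos = replicate(5, f_(randint, 1, 10))
>>> mforce(randos)
[3, 9, 2, 8, 5]    # illustrative; values vary on every run
```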

Not related to the `lazy` operation, but you can do the same thing with `uncurry`:

```python
# replicate the tuple of arguments (1, 10) and then apply to uncurried function
>>> map(u_(randint))(replicate(5, (1,10)))    # u_ == uncurry
[7, 6, 1, 7, 2]
```

### Raise and assert with _expressions_: `error` and `guard`

Raise any kind of exception in a `lambda` expression as well.

> `error(MESSAGE, e=EXCEPTION_TO_RAISE)`    
```python
>>> error("Error, used wrong type", e=TypeError)

>>> error("out of range", e=IndexError)

>>> (lambda x: x if x is not None else error("Error, got None", e=ValueError))(None)
```
Likewise, use `guard` if you need an _assertion_ not as a statement, but as an _expression_.

> `guard(PREDICATE, MESSAGE, e=EXCEPTION_TO_RAISE)` 
```python

>>> guard("Almost" == "enough", "'Almost' is never 'enough'")

>>> guard(rand() > 0.5, "Assertion error occurs with a 0.5 probability")

>>> guard(len(x := range(11)) == 10, f"length is not 10: {len(x)}")
```

### Exception catcher builder: `trap`
`trap` is a decorator factory that creates exception catchers. `e` indicates the error types you want to catch, and `callback` is a function to invoke with the caught error.  
This is very useful when handling exceptions with a functional approach on a _function-by-function basis_.

> `trap(callback, e=None)`

This will catch `ValueError` and then `print` the error message.
```python
>>> trap(print, e=ValueError)(error)(msg="a value-error occurred", e=ValueError)
a value-error occurred
```
This will catch all kinds of errors, then count the length of the error message when calling `func(*args, **kwargs)`.  
```python
trap(cf_(len, str), e=None)(func)(*args, **kwargs)
```

This function will never throw errors. It returns `None` instead of raising exceptions.

```python
@trap(callback=void, e=None)
def func(*args, **kwargs):
    ...
```
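
For instance, a minimal sketch (the name `safe_div` is illustrative; it assumes the wrapped call's normal result is returned when no error occurs):
```python
>>> @trap(callback=void, e=None)
... def safe_div(a, b):
...     return a / b

>>> safe_div(6, 2)
3.0

>>> safe_div(1, 0)    # ZeroDivisionError is caught; the 'void' callback yields None
```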



## Utilities
### Flatten iterables: `flat` and `flatten`

`flat` completely removes all nesting levels. (_deep flatten_)  
`flatten`, on the other hand, reduces the nesting depth by the given level. (_shallow flatten_)  
_String-like iterables_ such as `str`, `bytes`, and `bytearray` are not flattened.


> `flat(*args)`   
> `flatl(*args)`  
> `flatten(ITERABLE, d=LEVEL)`  

```python
>>> data = [1,2,[3,4,[[[5],6],7,{8},((9),10)],range(11,13)], (x for x in [13,14,15])]

>>> flat(data) | collect    
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]

>>> data = [1, [[2,{3}]], [[[[4]],5]], (('sofia','maria'),)]  
>>> flatten(data)  # by default, d=1
[1, [2, {3}], [[[4]], 5], ('sofia', 'maria')]

>>> flatten(data, d=2) 
[1, 2, {3}, [[4]], 5, 'sofia', 'maria']

>>> flatl(data)    # flatl(data) == flat(data) | collect  
[1, 2, 3, 4, 5, 'sofia', 'maria']
```
### Shell Command: `shell`
`shell` executes shell commands _synchronously_ or _asynchronously_ and captures their outputs.

> `shell(CMD, sync=True, o=True, *, executable="/bin/bash")`

```
  --------------------------------------------------------------------
    o-value  |  return |  meaning
  --------------------------------------------------------------------
     o =  1  |  [str]  |  captures stdout/stderr (2>&1)
     o = -1  |  None   |  discard (&>/dev/null)
  otherwise  |  None   |  do nothing or redirection (2>&1 or &>FILE)
  --------------------------------------------------------------------
```
`shell` performs the same operation as `ipython`'s magic command `!!`. However, it can also be used within a `python` script.

```python
>>> output = shell("ls -1 ~")    
>>> output = "ls -1 ~" | shell           # the same

>>> shell("find . | sort" o=-1)          # run and discard the result  
>>> "find . | sort" | shell(o=-1)    

>>> shell("cat *.md", o=writer(FILE))    # redirect to FILE
>>> "cat *.md" | shell(o=writer(FILE))   # redirect to FILE
```
### Neatify data structures: `neatly` and `nprint`
`neatly` generates a neatly formatted string from complex data structures of `dict` and `list`.

`nprint` (_neatly-print_) prints data structures to `stdout` using `neatly` formatter.   
`nprint(...)` = `print(neatly(...))`  

> `nprint(DICT, _cols=INDENT, _width=WRAP, _repr=BOOL, **kwargs)`

```python
>>> o = {
...   "$id": "https://example.com/enumerated-values.schema.json",
...   "$schema": "https://json-schema.org/draft/2020-12/schema",
...   "title": "Enumerated Values",
...   "type": "object",
...   "properties": {
...     "data": {
...       "enum": [42, True, "hello", None, [1, 2, 3]]
...     }
...   }
... }

>>> nprint(o)
       $id  |  'https://example.com/enumerated-values.schema.json'
   $schema  |  'https://json-schema.org/draft/2020-12/schema'
properties  |  data  |  enum  +  42
            :        :        -  True
            :        :        -  'hello'
            :        :        -  None
            :        :        -  +  1
            :        :           -  2
            :        :           -  3
     title  |  'Enumerated Values'
      type  |  'object'
```
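
Since `nprint(o)` is just `print(neatly(o))`, the formatted string itself can be captured and reused, for example to write it to a log file:
```python
>>> s = neatly(o)    # the same text 'nprint(o)' prints, as a plain string
>>> print(s)         # identical to nprint(o)
```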

### Dot-accessible dictionary: `dmap`
`dmap` is _yet another_ `dict`. It's exactly the same as `dict`, but it lets you access its nested structure with '_dot notation_'.

> `dmap(DICT, **kwargs)`

```python
>>> d = dmap()    # empty dict

>>> o = dict(name="yunchan lim", age=19)
>>> d = dmap(o, profession="pianist")    

>>> d = dmap(name="yunchan lim", age=19, profession="pianist")    # the same

# just put the value in the desired keypath
>>> d.cliburn.semifinal.mozart = "piano concerto no.22"
>>> d.cliburn.semifinal.liszt = "12 transcendental etudes"
>>> d.cliburn.final.beethoven = "piano concerto no.3"
>>> d.cliburn.final.rachmaninoff = "piano concerto no.3"
>>> nprint(d)
       age  |  19
   cliburn  |      final  |     beethoven  |  'piano concerto no.3'
            :             :  rachmaninoff  |  'piano concerto no.3'
            :  semifinal  |   liszt  |  '12 transcendental etudes'
            :             :  mozart  |  'piano concerto no.22'
      name  |  'yunchan lim'
profession  |  'pianist'
```
```python
>>> del d.cliburn.semifinal
>>> d.profession = "one-in-a-million talent"
>>> nprint(d)
       age  |  19
   cliburn  |  final  |     beethoven  |  'piano concerto no.3'
            :         :  rachmaninoff  |  'piano concerto no.3'
      name  |  'yunchan lim'
profession  |  'one-in-a-million talent'
```
```python
# No such keypath
>>> d.bach.chopin.beethoven
{}
```
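Since a missing keypath yields an empty `dmap`, and an empty `dmap` is falsy like an empty `dict`, a sketch of a safe lookup with a default:
```python
>>> d.cliburn.final.beethoven or "unknown"
'piano concerto no.3'

>>> d.bach.goldberg or "unknown"    # no such keypath -> empty dmap -> falsy
'unknown'
```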
### Handy File Tools: `ls` and `grep` 
Use `ls` and `grep` in the same way you use them in your terminal every day.   
_These are just more intuitive alternatives to_ `os.listdir` and `os.walk`. When applicable, use `shell` instead. 

> `ls(*paths, grep=REGEX, i=BOOL, r=BOOL, f=BOOL, d=BOOL, g=BOOL)`
```python
# couldn't be simpler!
>>> ls()       # the same as ls("."): get contents of the current dir

# expands "~" automatically
>>> ls("~")    # the same as `ls -a1 ~`: returns a list of $HOME

# support glob patterns (*, ?, [)
>>> ls("./*/*.py")

# with multiple filepaths
>>> ls(FILE, DIR, ...)
```
```python
# list recursively and filter out hidden files
>>> ls(".git", r=True, grep="^[^\.]")
```
```python
# only files in '.git' directory
>>> ls(".git", r=True, f=True)

# only directories in '.git' directory
>>> ls(".git", r=True, d=True)
```
```python
# search recursively and match a pattern with `grep`
>>> ls(".", r=True, i=True, grep=".Py")    # 'i=True' for case-insensitive grep pattern
```
```
[ ..
 '.pytest_cache/v/cache/stepwise',
 'foc/__init__.py',
 'foc/__pycache__/__init__.cpython-310.pyc',
 'tests/__init__.py',
.. ]
```
```python
# regex patterns come in
>>> ls(".", r=True, grep=".py$")
```
```
['foc/__init__.py', 'setup.py', 'tests/__init__.py', 'tests/test_foc.py']
```
```python
# that's it!
>>> ls(".", r=True, grep="^(foc).*py$")

# the same as above
>>> ls("foc/*.py")
```
```
['foc/__init__.py']
```

`grep` builds a filter that selects items matching a `REGEX` pattern from _iterables_.
> `grep(REGEX, i=BOOL)`

```python
# 'grep' builds filter with regex patterns
>>> grep(r"^(foc).*py$")(ls(".", r=True))
```
```
['foc/__init__.py']
```
_See also_: `HOME`, `cd`, `pwd`, `mkdir`, `rmdir`, `exists`, `dirname`, and `basename`.

### Flexible Progress Bar: `taskbar`

`taskbar` makes it easy to handle progress-bar related tasks. Actually, `taskbar` is the same as `rich.progress` except for the following:

- _No install required_
  > `taskbar` uses the `rich` bundled with `pip`, and `pip` is already installed almost everywhere.
- Fixed to a default _`tqdm`-like bar style_
- _Further simplified_ usage of `rich.progress`

> `taskbar(x=None, desc="working", *, start=0, total=None, barcolor="white", **kwargs)`  
> _See also_: `rich.progress.Progress(.., **kwargs)`

```python
# simply with iterables (or generators with 'total=LENGTH')
>>> for _ in taskbar(range(100), "[cyan] training model"):
...    ...   
 training model  100%  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━  100/100  0:00:20 < 0:00:00  4.94 it/s

# when starting in the middle of a progress
>>> for _ in taskbar(range(100), "[cyan] training model", start=30):
...     ... 

# manual update with multiple tasks
>>> with taskbar() as tb:
...     task1 = tb.add_task("[red] fine-tuning", total=1000)
...     task2 = tb.add_task("[green] train-critic", total=1000)
...     task3 = tb.add_task("[cyan] reinforce", total=1000)
...     while not tb.finished:
...         ...
...         tb.update(task1, advance=0.9)
...         tb.update(task2, advance=0.5)
...         tb.update(task3, advance=0.1)
...
 fine-tuning   18%  ━━━━━━━╺━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━   178/1000  0:00:20 < 0:01:34  8.79 it/s
 train-critic  10%  ━━━╸━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━    99/1000  0:00:20 < 0:03:05  4.88 it/s
 reinforce      6%  ━━╺━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━    59/1000  0:00:20 < 0:05:22  2.93 it/s
```

            
