Name | sl-experiment |
Version | 3.0.0 |
home_page | None |
Summary | Provides tools to acquire, manage, and preprocess scientific data in the Sun (NeuroAI) lab. |
upload_time | 2025-07-20 19:28:27 |
maintainer | None |
docs_url | None |
author | Ivan Kondratyev, Kushaan Gupta, Natalie Yeung, Katlynn Ryu, Jasmine Si |
requires_python | <3.12,>=3.11 |
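The requires_python constraint above (`<3.12,>=3.11`) means only Python 3.11.x interpreters satisfy it. A minimal sketch of checking that constraint at runtime; the helper function is hypothetical and not part of the sl-experiment package:

```python
import sys

def satisfies_requires_python(version_info=sys.version_info):
    """Return True if the interpreter version satisfies
    sl-experiment 3.0.0's declared constraint ">=3.11,<3.12"."""
    major, minor = version_info[0], version_info[1]
    return (3, 11) <= (major, minor) < (3, 12)

# Example: only 3.11.x passes
print(satisfies_requires_python((3, 11, 4)))  # True
print(satisfies_requires_python((3, 12, 0)))  # False
```

Package installers such as pip perform this check automatically against the `Requires-Python` metadata field, refusing to install on unsupported interpreters.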
license | GNU General Public License v3 (GPLv3) |

GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>. |
keywords |
acquisition
ataraxis
data
experiment
interface
mesoscope
sunlab
|
VCS |
 |
bugtrack_url |
|
requirements |
No requirements were recorded.
|
Travis-CI |
No Travis.
|
coveralls test coverage |
No coveralls.
|
# sl-experiment
A Python library that provides tools to acquire, manage, and preprocess scientific data in the Sun (NeuroAI) lab.






___
## Detailed Description
This library functions as the central hub for collecting and preprocessing the data for all current and future projects
in the Sun lab. To do so, it exposes the API to interface with all data acquisition systems in the lab. Primarily, this
relies on specializing various general-purpose libraries released as part of the 'Ataraxis' science-automation project
to work within the specific hardware implementations available in the lab.
This library is explicitly designed to work with the specific hardware and data handling strategies used in the Sun lab
and will likely not work in other contexts without extensive modification. The library broadly consists of two
parts: the shared assets and the acquisition-system-specific bindings. The shared assets are reused by all acquisition
systems and are mostly inherited from the [sl-shared-assets](https://github.com/Sun-Lab-NBB/sl-shared-assets) library.
The acquisition-system-specific code is tightly integrated with the hardware used in the lab and is generally not
designed to be reused in any other context. See the [data acquisition systems](#data-acquisition-systems) section for
more details on currently supported acquisition systems.
___
## Table of Contents
- [Installation](#installation)
- [Data Acquisition Systems](#data-acquisition-systems)
- [Mesoscope-VR System](#mesoscope-vr-data-acquisition-system)
- [Acquired Data Structure and Management](#acquired-data-structure-and-management)
- [Acquiring Data in the Sun Lab](#acquiring-data-in-the-sun-lab)
- [API Documentation](#api-documentation)
- [Recovering from Interruptions](#recovering-from-interruptions)
- [Versioning](#versioning)
- [Authors](#authors)
- [License](#license)
- [Acknowledgments](#acknowledgments)
---
## Installation
### Source
Note: installation from source is ***highly discouraged*** for anyone who is not an active project developer.
1. Download this repository to your local machine using your preferred method, such as Git-cloning. Use one
of the stable releases from [GitHub](https://github.com/Sun-Lab-NBB/sl-experiment/releases).
2. Unpack the downloaded zip and note the path to the binary wheel (`.whl`) file contained in the archive.
3. Run ```python -m pip install WHEEL_PATH```, replacing 'WHEEL_PATH' with the path to the wheel file, to install the
wheel into the active Python environment.
### pip
Use the following command to install the library using pip: ```pip install sl-experiment```.
___
## Data Acquisition Systems
A data acquisition (and runtime control) system can be broadly defined as a collection of hardware and software tools
used to conduct training or experiment sessions that acquire scientific data. Each data acquisition system can use one
or more machines (PCs) to acquire the data, with this library (sl-experiment) typically running on the **main** data
acquisition machine. Additionally, each system typically uses a Network Attached Storage (NAS) device, a remote storage
server, or both to safely store the data after acquisition (with redundancy and parity).
In the Sun lab, each data acquisition system is built around the main tool used to acquire the brain activity data. For
example, the main system in the Sun lab is the [Mesoscope-VR](#mesoscope-vr-data-acquisition-system) system, which uses
the [2-Photon Random Access Mesoscope (2P-RAM)](https://elifesciences.org/articles/14472). All other components of that
system are built around the Mesoscope to facilitate the acquisition of the brain activity data. Due to this inherent
specialization, each data acquisition system in the lab is treated as an independent unit that requires custom software
to acquire, preprocess, and process the resultant data.
***Note!*** Since each data acquisition system is unique, the section below will be iteratively expanded to include
system-specific assembly instructions for **each supported acquisition system**. Commonly, updates to this section
coincide with major or minor library version updates.
---
## Mesoscope-VR Data Acquisition System
This is the main data acquisition system currently used in the Sun lab. The system broadly consists of four major
parts:
1. The [2-Photon Random Access Mesoscope (2P-RAM)](https://elifesciences.org/articles/14472), assembled by
[Thor Labs](https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=10646) and controlled by
[ScanImage](https://www.mbfbioscience.com/products/scanimage/) software. The Mesoscope control and data acquisition
are performed by a dedicated computer referred to as the **'ScanImagePC'** or (less frequently) the
**'Mesoscope PC'**. This PC is assembled and configured by [MBF Bioscience](https://www.mbfbioscience.com/). The
only modification carried out by the Sun lab during assembly was configuring Server Message Block (SMB)
access to the root folder used by the ScanImage software to save the Mesoscope data.
2. The [Unity game engine](https://unity.com/products/unity-engine) running the Virtual Reality game world used in all
experiments to control the task environment and resolve the task logic. The virtual environment runs on the main data
acquisition computer referred to as the **'VRPC'** and relies on the [MQTT](https://mqtt.org/) communication protocol
and the [Sun lab implementation of the GIMBL package](https://github.com/Sun-Lab-NBB/Unity-tasks) to bidirectionally
interface with the virtual task environment.
3. The [microcontroller-powered hardware](https://github.com/Sun-Lab-NBB/sl-micro-controllers) that allows the animal
to bidirectionally interface with various physical components (modules) of the Mesoscope-VR system.
4. A set of visual and IR-range cameras, used to acquire behavior video data.
### Main Dependency
- ***Linux*** operating system. While the library *may* also work on Windows and (less likely) macOS, it has been
explicitly written for and tested on the mainline [6.11 kernel](https://kernelnewbies.org/Linux_6.11) and the
Ubuntu 24.10 GNU/Linux distribution using the [Wayland](https://wayland.freedesktop.org/) window system.
### Software Dependencies
***Note!*** This list only includes *external dependencies*, which are installed *in addition* to all
dependencies automatically installed from pip / conda as part of library installation. The dependencies below have to
be installed and configured on the **VRPC** before calling runtime commands via the command-line interface (CLI) exposed
by this library.
- [MQTT broker](https://mosquitto.org/) version **2.0.21**. The broker should be running locally and can use
the **default** IP (127.0.0.1) and Port (1883) configuration.
- [FFMPEG](https://www.ffmpeg.org/download.html). At a minimum, the installed FFMPEG version should support the H.265
and H.264 codecs with hardware acceleration (Nvidia GPU). This library was tested with version **7.1.1-1ubuntu1.1**.
- [MvImpactAcquire](https://assets-2.balluff.com/mvIMPACT_Acquire/). This library is tested with version **2.9.2**,
which is freely distributed. Higher GenTL producer versions will likely work too, but they require purchasing a
license.
- [Zaber Launcher](https://software.zaber.com/zaber-launcher/download) version **2025.6.2-1**.
- [Unity Game Engine](https://unity.com/products/unity-engine) version **6000.1.12f1**.
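Since runtimes depend on the MQTT broker listening locally, it can be useful to verify connectivity before launching a session. Below is a minimal sketch using only the standard library; the `broker_reachable` helper is illustrative and is not part of the sl-experiment CLI:

```python
import socket

def broker_reachable(host: str = "127.0.0.1", port: int = 1883, timeout: float = 2.0) -> bool:
    """Returns True if a TCP service (such as the mosquitto MQTT broker) accepts connections."""
    try:
        # A successful TCP handshake is enough to confirm that a broker process is listening.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False, start the broker (e.g., via `systemctl start mosquitto` on Ubuntu) before invoking sl-experiment commands.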
### Hardware Dependencies
**Note!** These dependencies only apply to the **VRPC**. Hardware dependencies for the **ScanImagePC** are determined
and controlled by MBF and ThorLabs. This library benefits from the **ScanImagePC** being outfitted with a 10-Gigabit
network card, but this is not a strict requirement.
- [Nvidia GPU](https://www.nvidia.com/en-us/). This library uses GPU hardware acceleration to encode acquired video
data. Any Nvidia GPU with hardware encoding chip(s) should work as expected. The library was tested with
[RTX 4090](https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4090/).
- A CPU with at least 12, preferably 16, physical cores. This library has been tested with
[AMD Ryzen 7950X CPU](https://www.amd.com/en/products/processors/desktops/ryzen/7000-series/amd-ryzen-9-7950x.html).
It is recommended to use CPUs with uniform ('full') cores, rather than modern Intel designs that mix 'E' and 'P'
cores, to ensure predictable performance of all library components.
- A 10-Gigabit capable motherboard or Ethernet adapter, such as [X550-T2](https://shorturl.at/fLLe9). Primarily, this is
required for the high-quality machine vision camera used to record videos of the animal’s face. The 10-Gigabit lines
are also used for transferring the data between the PCs used in the data acquisition process and destination machines
used for long-term data storage (see [acquired data management section](#acquired-data-structure-and-management) for
more details).
### System Assembly
The Mesoscope-VR system consists of multiple interdependent components. We are constantly making minor changes to the
system to optimize its performance and facilitate novel experiments and projects carried out in the lab. Treat this
section as a general system composition guide, but consult lab publications, rather than this section, for
instructions on building the specific system implementations used to acquire the data featured in those publications.
Physical assembly and mounting of ***all*** hardware components mentioned in the specific subsections below is discussed
in the [main Mesoscope-VR assembly section](#mesoscope-vr-assembly).
### Zaber Motors
All brain activity recordings with the Mesoscope require the animal to be head-fixed. To orient head-fixed animals on
the Virtual Reality treadmill (running wheel) and promote task performance, the system uses three groups of motors
controlled through Zaber motor controllers. The first group, the **HeadBar**, is used to position the animal’s head in
Z, Pitch, and Roll axes. Together with the movement axes of the Mesoscope, this allows for a wide range of
motions necessary to align the Mesoscope objective with the brain imaging plane. The second group of
motors, the **LickPort**, controls the position of the water delivery port (tube) and its sensor in the X, Y, and Z
axes. This
is used to ensure all animals have comfortable access to the water delivery tube, regardless of their head position.
The third group of motors, the **Wheel**, controls the position of the running wheel in the X-axis relative to the
head-fixed animal’s body and is used to position the animal on the running wheel to promote good running behavior.
The current snapshot of Zaber motor configurations used in the lab, alongside the motor parts list and electrical wiring
instructions, is available
[here](https://drive.google.com/drive/folders/1SL75KE3S2vuR9TTkxe6N4wvrYdK-Zmxn?usp=drive_link).
**Warning!** Zaber motors have to be configured correctly to work with this library. To (re)configure the motors to work
with the library, apply the setting snapshots from the link above via the
[Zaber Launcher](https://software.zaber.com/zaber-launcher/download) software. Make sure you read the instructions in
the 'Applying Zaber Configuration' document for the correct application procedure.
**Although this is highly discouraged, you can also edit the motor settings manually**. To configure the motors
to work with this library, you need to overwrite the non-volatile User Data of each motor device (controller) with
the data expected by this library:
1. **User Data 0**: Device CRC Code. This variable should store the CRC32-XFER checksum of the device’s name
(user-defined name). During runtime, the library generates the CRC32-XFER checksum of each device’s name and compares
it against the value stored inside the User Data 0 variable to ensure that each device is configured appropriately to
work with the sl-experiment library. **Hint!** Use the `sl-crc` console command to generate the CRC32-XFER checksum
for each device during manual configuration, as it uses the same code as used during runtime and, therefore,
guarantees that the checksums will match.
2. **User Data 1**: Device ShutDown Flag. This variable is used as a secondary safety measure to ensure each device has
been properly shut down during previous runtimes. As part of the manual device configuration, make sure that this
variable is set to **1**. Otherwise, the library will not start any runtime that involves that Zaber motor. During
runtime, the library sets this variable to 0; if the runtime is interrupted without executing the proper shutdown
sequence that sets the variable back to 1, the motor cannot be used again without manual intervention.
**Warning!** Before setting this variable to 1, it is imperative to ensure that all motors are parked at positions
from which they are **guaranteed** to home successfully after power cycling. Otherwise, some motors may collide with
other system components and damage the acquisition system.
3. **User Data 10**: Device Axis Type Flag. This variable should be set to **1** for motors that move on a linear axis
and **0** for motors that move on a rotary axis. This static flag is primarily used to support proper unit
conversions and motor positioning logic during runtimes.
4. **User Data 11**: Device Park Position. This variable should be set to the position, in native motor units, where
the device should be moved as part of the 'park' command and the shut-down sequence. This is used to position all
motors in a way that guarantees they can be safely 'homed' at the beginning of the next runtime. Therefore, each
park position has to be selected so that each motor can reach its 'home' sensor without colliding with any other
motor **simultaneously** moving towards its 'home' position. **Note!** The lick-port uses the 'park' position as
the **default** imaging position. During runtime, it will move to the 'park' position if it has no animal-specific
position to use during imaging. Therefore, make sure that the park position for the lick-port is always set so that
it cannot harm the animal mounted in the Mesoscope enclosure while moving to the park position from any other
position.
5. **User Data 12**: Device Maintenance Position. This variable should be set to the position, in native motor units,
where the device should be moved as part of the 'maintain' command. Primarily, this position is used during water
delivery system calibration and the running-wheel surface maintenance. Typically, this position is calibrated to
provide easy access to all hardware components of the system by moving all motors as far away from each other as
reasonable.
6. **User Data 13**: Device Mount Position. This variable should be set to the position, in native motor units, where
the device should be moved as part of the 'mount' command. For the lick-port, this position is usually far away from
the animal, which facilitates mounting and unmounting the animal from the rig. For the head-bar and the wheel motor
groups, this position is used as the **default** imaging position. Therefore, set the head-bar and the wheel 'mount'
positions so that any (new) animal can be comfortably and safely mounted in the Mesoscope enclosure.
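For reference, the CRC32-XFER checksum mentioned in User Data 0 can be sketched in a few lines (use the `sl-crc` command for actual configuration; the bit-by-bit implementation and the UTF-8 encoding of the device name below are assumptions for illustration):

```python
def crc32_xfer(data: bytes) -> int:
    """Computes the CRC-32/XFER checksum: polynomial 0xAF, zero init/xorout, no bit reflection."""
    crc = 0
    for byte in data:
        crc ^= byte << 24  # feed the next byte into the top of the register
        for _ in range(8):
            if crc & 0x80000000:
                crc = ((crc << 1) ^ 0x000000AF) & 0xFFFFFFFF
            else:
                crc = (crc << 1) & 0xFFFFFFFF
    return crc

# 'HeadBar' is a hypothetical device name; real names come from the Zaber device settings.
checksum = crc32_xfer("HeadBar".encode("utf-8"))
```

The value stored in User Data 0 should match what `sl-crc` prints for the same device name.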
### Behavior Cameras
To record the animal’s behavior, the system uses a group of three cameras. The **face_camera** is a high-end
machine-vision camera used to record the animal’s face at approximately 3-megapixel resolution. The **left_camera**
and the **right_camera** are 1080p security cameras used to record the body of the animal. Only the data recorded by the
**face_camera** is currently used during data processing and analysis, but the data from all available cameras is saved
during acquisition. To interface with the cameras, the system leverages customized
[ataraxis-video-system](https://github.com/Sun-Lab-NBB/ataraxis-video-system) bindings.
Specific information about the cameras and related imaging hardware, as well as the snapshot of the configuration
parameters used by the **face_camera**, is available
[here](https://drive.google.com/drive/folders/1l9dLT2s1ysdA3lLpYfLT1gQlTXotq79l?usp=sharing).
### MicroControllers
To interface with all hardware components **other** than cameras and Zaber motors, the Mesoscope-VR system uses
Teensy 4.1 microcontrollers running specialized
[ataraxis-micro-controller](https://github.com/Sun-Lab-NBB/ataraxis-micro-controller) code. Currently, the system
uses three isolated microcontroller subsystems: **Actor**, **Sensor**, and **Encoder**.
For instructions on assembling and wiring the electronic components used in each microcontroller system, as well as the
code running on each microcontroller, see the
[microcontroller repository](https://github.com/Sun-Lab-NBB/sl-micro-controllers).
### Virtual Reality Task Environment (Unity)
The task environment used in all Mesoscope-VR experiments is rendered and controlled by the Unity game engine. To make
Unity work with this library, each project-specific Unity task must use the bindings and assets released as part of the
[Unity-tasks repository](https://github.com/Sun-Lab-NBB/Unity-tasks). Follow the instructions from that repository to
set up the Unity game engine to interface with this library and to create new virtual task environments.
**Note!** This library does not contain tools to initialize the Unity game engine. The desired virtual reality task
has to be started ('armed') ***manually*** before entering the main runtime (data acquisition session) cycle. The main
Unity repository contains more details about starting virtual reality tasks when running experiments. During
CLI-driven experiment runtimes, the library instructs the user when to 'arm' the Unity game engine.
### Google Sheets API Integration
This library interacts with the shared Google Sheet files used in the Sun lab to track and communicate certain
information about the animals that participate in all projects. Currently, this includes two files: the **surgery log**
and the **water restriction log**. Primarily, this integration is used to ensure that all information about each
experiment subject (animal) is stored in the same location (on the long-term storage machine(s)). Additionally, it is
used in the lab to automate certain data logging tasks.
#### Setting up Google Sheets API Access
**If you already have a service Google Sheets API account, skip to the next section.** Most lab members can safely
ignore this section, as all service accounts are managed at the acquisition-system level rather than by individual
lab members.
1. Log into the [Google Cloud Console](https://console.cloud.google.com/welcome).
2. Create a new project.
3. Navigate to APIs & Services → Library and enable the Google Sheets API for the project.
4. Under IAM & Admin → Service Accounts, create a service account. This will generate a service account ID in the
format `your-service-account@your-project.iam.gserviceaccount.com`.
5. Use Actions → Manage Keys and, if a key does not already exist, create a new key and download it in JSON format.
This key is then used to access the Google Sheets.
#### Adding Google Sheets Access to the Service Account
To access the **surgery log** and the **water restriction log** Google Sheets as part of this library runtime, create
and share these log files with the email of the service account created above. The service account requires **Editor**
access to both files.
**Note!** This feature requires that both log files are formatted according to the available Sun lab templates.
Otherwise, the parsing algorithm will not behave as expected, leading to runtime failures. Additionally, both log files
have to be pre-filled in advance, as the processing code is not allowed to automatically generate new table (log) rows.
**Hint!** Currently, it is advised to pre-fill the data a month in advance. Since most experiments last for at most a
month, this usually covers the entire experiment period for any animal.
### ScanImage PC Assets
As mentioned above, the ScanImagePC is largely assembled and configured by external contractors. However, the PC
requires additional assets and configuration post-assembly to make it compatible with sl-experiment-managed runtimes.
#### File System Access
To support the sl-experiment runtime, the ScanImagePC’s filesystem must be accessible to the **VRPC** via the Server
Message Block version 3 (SMB3) or an equivalent protocol. Since the ScanImagePC runs Windows, SMB3 is advised, as all
Windows machines support it natively with minimal configuration. At a minimum, the ScanImagePC must be configured to
share the root Mesoscope output folder with the VRPC over the local network. This is required both to fetch the data
acquired by the Mesoscope during preprocessing and to control the Mesoscope during runtime.
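Before each session, the VRPC-side mount can be sanity-checked with a small write probe. The helper and the mount path below are illustrative assumptions, not part of the sl-experiment API:

```python
from pathlib import Path

def share_is_writable(share_root: str) -> bool:
    """Checks that the SMB-mounted Mesoscope output folder exists and accepts writes."""
    root = Path(share_root)
    if not root.is_dir():
        return False
    probe = root / ".vrpc_write_probe"
    try:
        probe.touch()
        probe.unlink()
        return True
    except OSError:
        return False

# Hypothetical mount point; the real path depends on how the share is mounted on the VRPC:
# share_is_writable("/mnt/scanimage/mesoscope_data")
```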
#### Default Screenshot Directory
During runtime, the sl-experiment library prompts the user to generate a screenshot of the ScanImagePC desktop and
place it in the network-shared mesoscope data folder (see above). The screenshot is used to store the information about
the red-dot alignment, the acquisition parameters, and the state of the imaging plane at the beginning of each session.
The library is statically configured to fetch the screenshot from the shared folder and will not look in any other
directory. Therefore, it is advised to reconfigure the default output directory used by the 'Win + PrtScn' shortcut on
the ScanImagePC so that screenshots are saved into the shared Mesoscope output directory.
#### MATLAB Assets
ScanImage software is written in MATLAB and controls all aspects of Mesoscope data acquisition. While each runtime
requires the experimenter to manually interface with the ScanImage GUI during Mesoscope preparation, all data
acquisition runtimes using the sl-experiment library require the user to call the **setupAcquisition** MATLAB function
available from [mesoscope assets repository](https://github.com/Sun-Lab-NBB/sl-mesoscope-assets). This function carries
out multiple runtime-critical tasks, including setting up the acquisition, generating and applying the online motion
correction algorithm, and allowing the VRPC to control the Mesoscope by creating or removing binary marker files.
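The marker-file mechanism itself is simple: the VRPC creates or deletes small files in the shared folder, and the MATLAB side polls for their presence. A sketch of the VRPC side follows; the `acquire.marker` file name is an invented placeholder, as the actual marker names are defined by the sl-mesoscope-assets code:

```python
from pathlib import Path

def set_marker(shared_root: str, name: str = "acquire.marker") -> None:
    """Creates a binary marker file to signal the ScanImagePC."""
    (Path(shared_root) / name).touch()

def clear_marker(shared_root: str, name: str = "acquire.marker") -> None:
    """Removes the marker file, withdrawing the signal."""
    (Path(shared_root) / name).unlink(missing_ok=True)
```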
To configure MATLAB to access the mesoscope assets, git-clone the entire repository to the ScanImagePC. Then, follow the
tutorials [here](https://www.mathworks.com/help/matlab/matlab_env/add-remove-or-reorder-folders-on-the-search-path.html)
and add the path to the root mesoscope assets folder to MATLAB’s search path. MATLAB will then be able to use all
functions from that repository, including the setupAcquisition function. The repository also contains the online
motion estimation and correction assets developed in the [Pachitariu and Stringer lab](https://mouseland.github.io/),
which are required for the setupAcquisition function to work as expected.
### Mesoscope-VR Assembly
***This section is currently a placeholder. Since the final Mesoscope-VR system design is still a work in progress, it
will be populated once the final design implementation is constructed and tested in the lab.***
The Mesoscope-VR assembly mostly consists of two types of components. First, it includes custom components manufactured
via 3D-printing or machining (for metalwork). Second, it consists of generic components available from vendors such as
ThorLabs, which are altered in workshops to fit the specific requirements of the Mesoscope-VR system. The blueprints and
CAD files for all components of the Mesoscope-VR systems, including CAD renders of the assembled system, are available
[here](https://drive.google.com/drive/folders/1Oz2qWAg3HkMqw6VXKlY_c3clcz-rDBgi?usp=sharing).
___
## Acquired Data Structure and Management
The library defines a fixed structure for storing all acquired data, which uses a four-level directory hierarchy:
**root** (volume), **project**, **animal**, and **session**. This structure is reused by all acquisition systems, and
it is maintained across all long-term storage destinations. After each data acquisition runtime (session), all
raw data is stored under the **root/project/animal/session/raw_data** directory on one or more of the machines listed
below. After each data processing pipeline runtime, all processed data generated by that pipeline is stored under
**root/project/animal/session/processed_data**.
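In code, the four-level hierarchy can be composed with `pathlib`; the helper below is an illustration of the layout, not the library's actual API:

```python
from pathlib import Path

def session_layout(root: str, project: str, animal: str, session: str) -> dict[str, Path]:
    """Builds the raw_data and processed_data paths for a single session."""
    base = Path(root) / project / animal / session
    return {"raw_data": base / "raw_data", "processed_data": base / "processed_data"}

# Example with hypothetical names: a 'Template' project and animal '666'.
layout = session_layout("/media/Data/Experiments", "Template", "666", "2025-11-11-05-03-14-234123")
```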
Currently, each data acquisition system in the lab uses at least three machines:
1. The **main data acquisition PC** is used to acquire and preprocess the data. For example, the *VRPC* of the
*Mesoscope-VR* system is the main data acquisition PC for that system. This PC is used to both **acquire** the data
and, critically, to **preprocess** the data before it is moved to the long-term storage destinations.
2. The **BioHPC compute server** is the main long-term storage destination and the machine used to process and
analyze the data. This is a high-performance computing server owned by the lab that can be optionally extended by
renting additional nodes from Cornell’s BioHPC cluster. The BioHPC server stores both the raw data and the processed
data generated by all Sun lab processing pipelines.
3. The **Synology NAS** is the back-up 'cold' long-term storage destination. It only stores the raw data and is
physically located in a different building from the main BioHPC compute server to provide data storage redundancy. It
is only used to back up raw data and is generally not intended to be accessed unless the main data storage becomes
compromised for any reason.
***Critical!*** Each data acquisition system is designed to **mount the BioHPC and the NAS to the main acquisition PC
filesystem using the Server Message Block 3 (SMB3) protocol**. Therefore, each data acquisition system operates on the
assumption that all storage components are mounted as parts of one contiguous filesystem that can be freely accessed
by the main acquisition PC's OS.
***Note!*** The library tries to maintain at least two copies of data for long-term storage: one on the NAS and the
other on the BioHPC server. It is configured to purge redundant data from the data acquisition system machines if
the data has been **safely** moved to the long-term storage destinations. The integrity of the moved data is verified
using an xxHash-128 checksum before the data is removed from the data-acquisition system.
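The verify-then-purge step can be sketched as follows. The real pipeline uses xxHash-128 (a third-party package); this sketch substitutes the standard library's 128-bit BLAKE2b digest so it remains dependency-free, and the `safe_move` helper is an illustration rather than the library's implementation:

```python
import hashlib
import shutil
from pathlib import Path

def file_digest(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hashes a file in chunks; a 16-byte BLAKE2b digest stands in for xxHash-128 here."""
    digest = hashlib.blake2b(digest_size=16)
    with path.open("rb") as handle:
        while block := handle.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

def safe_move(source: Path, destination: Path) -> None:
    """Copies a file, verifies the copy's checksum, and only then removes the original."""
    shutil.copy2(source, destination)
    if file_digest(source) != file_digest(destination):
        destination.unlink(missing_ok=True)
        raise IOError(f"Checksum mismatch while moving {source} to {destination}")
    source.unlink()
```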
### Root Directory (Volume)
All data acquisition systems, the Synology NAS, and the BioHPC server keep **ALL** Sun lab projects in the same
**root** directory. The BioHPC server uses **two roots**: one for the raw data and the other for the processed data.
This separation is due to the BioHPC server using a combination of fast NVMe drives and slower HDDs to optimize
storage cost and data processing performance. The exact location and name of the root directory on each machine is
arbitrary but should generally remain fixed (unchanging) over the entire lifetime of that specific machine.
### Project Directory
When a new project is created, a **project** directory **named after the project** is created under the **root**
directory of the main data acquisition machine, the Synology NAS, and both the raw and processed data BioHPC volumes.
Depending on the host machine, this project directory may contain further subdirectories.
All data acquisition systems also create a **configuration** subdirectory under the root project directory. This
directory stores all supported experiment configurations for the project. Each 'sl-experiment' CLI command call
searches this configuration directory for a .yaml file named after the target experiment and loads the experiment
data from that file.
### Animal Directory
When the library is used to acquire data for a new animal, it generates a new **animal** directory under the **root**
and **project** directory combination. The directory uses the animal's ID as its name. Depending on the host
machine, this animal directory may contain further subdirectories.
All data acquisition systems also create a **persistent_data** subdirectory under the root animal directory, which is
used to store data that is reused between data acquisition sessions.
***Critical!*** The current Sun lab convention stipulates that all animal IDs should be numeric. While some library
components do accept strings as inputs, it is expected that all animal IDs only consist of positive integers. Failure to
adhere to this naming convention can lead to runtime errors and unexpected behavior of all library components!
### Session Directory
Each time the library is used to acquire data, a new session directory is created under the **root**, **project**, and
**animal** directory combination. The session name is derived from the current ***UTC*** timestamp at the time of
session directory creation, accurate to ***microseconds***. Primarily, this naming format was chosen so that all
sessions acquired by the same acquisition system have unique, chronologically sortable names. The session name format
follows the order of **YYYY-MM-DD-HH-MM-SS-US**.
### Raw Data and Processed Data
All data acquired by this library is stored under the **raw_data** subdirectory, generated for each session. Overall,
an example path to the acquired (raw) data can therefore look like this:
`/media/Data/Experiments/Template/666/2025-11-11-05-03-234123/raw_data/`.
Similarly, all Sun lab data processing pipelines generate new files and subdirectories under the **processed_data**
subdirectory for each session. An example path to the processed_data directory can therefore look like this:
`/server/sun_data/Template/666/2025-11-11-05-03-234123/processed_data`.
***Note!*** This library treats **both** newly acquired and preprocessed data as **raw**, because preprocessing
**does not change the content of the data**. Instead, preprocessing uses lossless compression to more efficiently
package the data for transmission, and the preprocessed data can be converted back to its original format at any time.
Processing the data, on the other hand, generates additional data and / or modifies the data values in ways that are
not necessarily reversible.
### Shared Raw Data
The section below briefly lists the data acquired by **all** Sun lab data acquisition systems. Note that each
acquisition system also generates **system-specific** data, which is listed in the acquisition-system-specific sections
that follow.
**Note!** For information about the **processed** data, see the
[main data processing library](https://github.com/Sun-Lab-NBB/sl-forgery).
After acquisition and preprocessing, the **raw_data** folder of each acquisition system will, as a minimum, contain the
following files and subdirectories:
1. **ax_checksum.txt**: Stores the xxHash-128 checksum used to verify data integrity when it is transferred to the
long-term storage destination. The checksum is generated before the data leaves the main data acquisition system PC
and, therefore, accurately captures the final state of the raw data before it enters storage.
2. **hardware_state.yaml**: Stores the snapshot of the dynamically calculated parameters used by the data acquisition
system modules during runtime. These parameters are recalculated at the beginning of each data acquisition
session and are rounded and stored using the appropriate floating point type (usually fp64) to minimize floating
point rounding errors. This improves the quality of processed data by ensuring that the processing and the
acquisition pipelines use the same floating-point parameter values. This file is also used to determine which modules
were used during runtime and, consequently, which data can be parsed from the .npz log files generated at runtime
(see below).
3. **session_data.yaml**: Stores information necessary to maintain the same session data structure across all machines
used during data acquisition and long-term storage. This file is used by all Sun lab libraries as an entry point for
working with a session's data. The file also includes all available information about the identity and purpose of the
session and can be used by human experimenters to identify the session. Since version 3.0.0, the file also stores
the version of the sl-experiment and Python that were used to acquire the data.
4. **session_descriptor.yaml**: Stores session-type-specific information, such as the task parameters and experimenter
notes. The contents of the file are overall different for each session type, although some fields are reused by all
sessions. The contents for this file are partially written by the library (automatically) and, partially, by the
experimenter (manually, at the end of each session). At the end of each runtime, a copy of the descriptor file is
cached inside the *persistent_data* directory of the animal, replacing any already existing copy. This is used to
optionally restore certain runtime configuration parameters between session types that support this functionality.
5. **surgery_metadata.yaml**: Stores the data on the surgical intervention(s) performed on the animal that participated
in the session. This data is extracted from the **surgery log** Google Sheet and, for most animals, should be the
same across all sessions.
6. **system_configuration.yaml**: Stores the configuration parameters of the data acquisition system that generated the
session data. This is a snapshot of **all** dynamically addressable configuration parameters used by the system.
When combined with the assembly instructions and the appropriate sl-experiment library version, it allows completely
replicating the data acquisition system used to acquire the session data.
7. **behavior_data**: Stores compressed .npz log files that contain all non-video behavior data acquired by the system.
This includes all messages sent or received by each microcontroller, the timestamps for the frames acquired by
each camera and (if applicable) the main brain activity recording device (e.g.: Mesoscope). These logs also include
session metadata, such as trials, task conditions, and system and runtime state transitions. Although the exact
content of the behavior data folder can differ between acquisition systems, all systems used in the lab generate some
form of non-video behavior data.
8. **camera_data**: Stores the behavior videos recorded by video cameras used by the acquisition system. While not
technically required for every system, all current Sun lab data acquisition systems use one or more cameras to
acquire behavior videos.
9. **experiment_configuration.yaml**: This file is only created for **experiment** sessions. It stores the
configuration of the experiment task performed by the animal during runtime. The contents of the file differ for each
data acquisition system, but each system generates a version of this file. The file contains enough information to
fully replicate the experiment runtime on the same acquisition system and to process and analyze the acquired data.
10. **telomere.bin**: This marker is used to communicate whether the session data is **complete**. Incomplete sessions
appear due to session acquisition runtime being unexpectedly interrupted. This is rare and is typically the result of
major emergencies, such as sudden power loss or other unforeseen events. Incomplete sessions are automatically
excluded from automated data processing and require manual user intervention to assess the usability of the session
for analysis. This marker file is created **exclusively** as part of session data preprocessing, based on the value
of the **session_descriptor.yaml** file 'incomplete' field.
11. **integrity_verification_tracker.yaml**: This tracker file is used internally to run and track the outcome of the
remote data verification procedure. This procedure runs as part of moving the data to the long-term storage
destinations to ensure the data is transferred intact. Users can optionally check the status of the verification by
accessing the data stored inside the file. **Note!** This file is added **only!** to the raw_data folder stored on
the BioHPC server.
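As a sketch of what working with the **behavior_data** archives might look like, the snippet below loads every serialized message from one compressed .npz archive with NumPy. The `read_behavior_log` helper and the entry names are assumptions for illustration, not part of the library's API:

```python
import numpy as np

def read_behavior_log(archive_path: str) -> dict[str, np.ndarray]:
    """Loads all serialized log messages from a compressed .npz archive.

    Each entry is a uint8 (byte) NumPy array holding one serialized log message,
    which downstream tooling is expected to parse into structured records.
    """
    with np.load(archive_path) as archive:
        return {name: archive[name] for name in archive.files}
```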
### Shared Temporary Data
The sl-experiment library additionally uses the following temporary marker files and directories which are cleared
before the raw data is transmitted to the long-term storage destinations:
1. **nk.bin**: This marker is automatically cached to disk as part of creating a new session data hierarchy. Each
runtime removes this marker file when it successfully completes its runtime preparation (the main start() method call
of the runtime management class). If this marker exists when the runtime enters the shutdown cycle, this indicates
that the runtime encountered a fatal error during startup and had to be terminated early. In this case, the session’s
data is silently deleted, as an uninitialized runtime cannot contain any valid data. This automatically declutters
the data acquisition and long-term storage PCs, keeping only valid sessions.
2. **behavior_data_log**: All behavior log entries are initially saved as individual .npy files. Each .npy file stores
a serialized log message in the form of an uint8 (byte) NumPy array. Since messages are cached to disk as soon as
they are received by the DataLogger to minimize data loss in case of emergency shutdowns, the temporary
behavior_data_log directory is used to store these messages during runtime. Frequently, the directory accumulates
millions of .npy files at runtime, making it challenging for human operators to work with the data. During
preprocessing, individual .npy files are grouped by their source (what made the log entry, e.g.: VideoSystem,
MicroController, Data Acquisition System, etc.) and are compressed into .npz archives, one for each source. The
.npz archives are then moved to the *behavior_data* folder, and the *behavior_data_log* directory with the individual
.npy files is deleted to conserve disk space.
3. **ubiquitin.bin**: This marker file is created during *preprocessing* runtime to mark directories that
are no longer needed for deletion. Specifically, when preprocessing safely moves the acquired data to long-term
storage destinations, it marks all now-redundant raw data from all acquisition system machines (PCs) for removal to
optimize disk usage. At the end of preprocessing, a specialized *purge* runtime is executed to discover and remove
all directories marked for deletion with the ubiquitin.bin marker files.
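The .npy-to-.npz repackaging step described in item 2 above can be sketched as follows. The file-naming pattern and the `compress_log_entries` helper are assumptions for illustration, not the library's actual implementation:

```python
from pathlib import Path

import numpy as np

def compress_log_entries(log_directory: Path, source: str) -> Path:
    """Groups the individual .npy log entries produced by one source and repackages
    them into a single compressed .npz archive named after that source."""
    entries = sorted(log_directory.glob(f"{source}_*.npy"))
    arrays = {path.stem: np.load(path) for path in entries}
    archive_path = log_directory.parent / f"{source}_log.npz"
    np.savez_compressed(archive_path, **arrays)
    return archive_path
```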
### Mesoscope-VR System Data
The Mesoscope-VR system instantiates a directory hierarchy both on the VRPC and the ScanImagePC. Below is the list of
files and directories found on each of these machines.
#### Raw Data
The Mesoscope-VR system generates the following files and directories, in addition to those discussed in the shared
raw data section on the VRPC:
1. **mesoscope_data**: Stores all Mesoscope-acquired data (frames, motion estimation files, etc.). Since Mesoscope data
is only acquired for **experiment** sessions, this directory is kept empty for all other session types. During
preprocessing, the folder contents are organized to work automatically with the
[sl-suite2p](https://github.com/Sun-Lab-NBB/suite2p) single-day processing pipeline. All Mesoscope data is intended
to be processed with the sl-suite2p library.
2. **zaber_positions.yaml**: Stores the snapshot of the positions used by the HeadBar, LickPort, and Wheel Zaber motor
groups, taken at the end of the session’s data acquisition. All positions are stored in native motor units. This
file is created for all session types supported by the Mesoscope-VR system. As a backup, a copy of this file is also
generated at the beginning of each session runtime. This allows recovering from critical runtime failures, where the
runtime may not be able to generate this snapshot. During both snapshot generation timepoints, a copy of the
generated snapshot file is also cached inside the *persistent_data* directory of the animal to support restoring the
motors to the same position during the next session.
3. **mesoscope_positions.yaml**: Stores the snapshot of the physical Mesoscope objective position in X, Y, Z, and Roll
axes, the virtual ScanImage axes (Fast Z, Tip, Tilt), and the laser power at the sample, taken at the end of the
session’s data acquisition. **Note!** This file relies on the experimenter updating the stored positions if they are
changed during runtime. It is only created for window checking and experiment sessions. A copy of this snapshot file
is also saved to the *persistent_data* directory of the animal to support restoring the Mesoscope to the same imaging
field during the next session.
4. **window_screenshot.png**: Stores the screenshot of the ScanImagePC screen. The screenshot should contain the image
of the red-dot alignment, the view of the target cell layer, the Mesoscope position information, and the data
acquisition parameters. Primarily, the screenshot is used by experimenters to quickly reference the imaging quality
from each experiment session. This file is only created for window checking and experiment sessions. A copy of this
file is saved to the *persistent_data* directory of the animal to help the user realign the red dot to a similar
position during the next session.
#### ScanImagePC
All Mesoscope-VR system data on the ScanImagePC is stored under the user-defined ScanImagePC root directory, which is
expected to be mounted to the VRPC via SMB or a similar protocol. Under that root directory, the system creates the
following directories and files:
1. **mesoscope_data**: This directory stores all Mesoscope-acquired data for the currently running session. The
*setupAcquisition* MATLAB function configures ScanImage software to output all data to the mesoscope_data directory,
which is shared by all sessions, animals, and projects. This allows using the same static output path for all
ScanImage acquisitions.
2. **session-specific mesoscope_data**: At the end of each runtime, the Mesoscope-VR system ***renames*** the
mesoscope_data directory to include the session name (id). Then, it generates an empty mesoscope_data directory for
the next runtime. This way, all data of each completed session is stored under a unique directory named after that
session. This step is crucial for data preprocessing, which identifies the session data directory and pulls it over
to the VRPC based on the session name (id).
3. **persistent_data**: This directory is created for each unique **project** and **animal** combination,
similar to the data structure created by sl-experiment on the main acquisition system PC. This directory contains
the **first day** MotionEstimator.me and fov.roi files. These files are typically reused by all following data
acquisition sessions to restore the imaging field to the same location as used on the first day. The full path to
the persistent_data directory would typically look like **root/project/animal/persistent_data**.
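The rename-and-recreate step in item 2 above can be sketched with `pathlib`. The `rotate_mesoscope_directory` helper is hypothetical, shown only to illustrate the directory rotation:

```python
from pathlib import Path

def rotate_mesoscope_directory(root: Path, session_name: str) -> Path:
    """Renames the shared mesoscope_data directory to a session-specific name and
    recreates an empty mesoscope_data directory for the next runtime."""
    session_specific = root / f"{session_name}_mesoscope_data"
    (root / "mesoscope_data").rename(session_specific)
    (root / "mesoscope_data").mkdir()
    return session_specific
```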
#### Mesoscope-VR Temporary Data
The Mesoscope-VR system also generates the following temporary files and directories during runtime:
1. **raw_mesoscope_data**: Stores uncompressed .TIFF stacks fetched by the VRPC from the ScanImagePC. This is done
as part of data preprocessing to collect all data on the VRPC before executing individual preprocessing subroutines.
The .TIFF stacks are then re-compressed using the Limited Error Raster Compression (LERC) scheme and are saved to the
*mesoscope_data* folder. Once this process completes successfully, the *raw_mesoscope_data* with all raw files is
deleted to conserve disk space.
2. **kinase.bin**: This marker is created in the *mesoscope_data* ScanImagePC directory. During runtime, the
*setupAcquisition* MATLAB function monitors the *mesoscope_data* directory for the presence of this marker file. If
the file is present, the function triggers the Mesoscope data acquisition. If the file is absent, the function stops
the Mesoscope data acquisition until the file is re-created. As such, the VRPC uses this marker file to start
and stop the Mesoscope data acquisition during normal operation.
3. **phosphatase.bin**: This marker works similarly to the *kinase.bin* marker, but is used by the VRPC to abort the
   ScanImagePC runtime at any point. While the ScanImagePC is waiting for the *kinase.bin* marker to be created for the
   first time, stopping the Mesoscope acquisition would otherwise require first creating and then removing the kinase
   marker, which triggers a brief Mesoscope frame acquisition. Creating the *phosphatase.bin* marker instead causes the
   ScanImagePC to end the runtime without waiting for the *kinase.bin* marker, effectively aborting the runtime without
   acquiring any frames.
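The marker-based handshake described above reduces to creating and removing empty files in the shared directory. A minimal sketch, assuming the marker names from the list above (the helper functions themselves are hypothetical):

```python
from pathlib import Path

def start_acquisition(mesoscope_data: Path) -> None:
    """Creating kinase.bin signals the ScanImagePC to start acquiring Mesoscope frames."""
    (mesoscope_data / "kinase.bin").touch()

def stop_acquisition(mesoscope_data: Path) -> None:
    """Removing kinase.bin signals the ScanImagePC to stop acquiring Mesoscope frames."""
    (mesoscope_data / "kinase.bin").unlink(missing_ok=True)

def abort_runtime(mesoscope_data: Path) -> None:
    """Creating phosphatase.bin ends the ScanImagePC runtime without acquiring any frames."""
    (mesoscope_data / "phosphatase.bin").touch()
```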
---
## Acquiring Data in the Sun Lab
All user-facing library functionality is realized through a set of Command Line Interface (CLI) commands automatically
exposed when the library is pip-installed into a Python environment. Some of these commands take additional arguments
that further configure their runtime. Use the `--help` argument with any of the commands described below to
see the list of supported arguments together with their descriptions and default values.
To use any of the commands described below, activate the Python environment where the library is installed, e.g., with
`conda activate MYENV`, and call the desired command.
***Warning!*** Most commands described below use the terminal to communicate important runtime information to the user
or request user feedback. **Make sure you carefully read every message printed to the terminal during runtime**.
Failure to do so may damage the equipment or harm the animal!
### Step 0: Configuring the Data Acquisition System
Before acquiring data, each acquisition system has to be configured. This step is done in addition to assembling
the system and installing the required hardware components. Typically, this only needs to be done when the acquisition
system configuration or hardware changes, so most lab members can safely skip this step.
Use the `sl-create-system-config` command to generate the system configuration file. As part of its runtime, the command
configures the host machine to remember the path to the generated configuration file, so all future sl-experiment
runtimes on that machine will automatically load and use the appropriate acquisition-system configuration parameters.
***Note!*** Each acquisition system uses unique configuration parameters. Additionally, the sl-experiment library always
assumes that any machine (PC) can only be used by a single data-acquisition system (is permanently a part of that
acquisition system). Only the **main** PC of the data acquisition system (e.g.: the VRPC of the Mesoscope-VR system)
that runs the sl-experiment library should be configured via this command.
For information about the available system configuration parameters, read the *API documentation* of the appropriate
data-acquisition system available from the [sl-shared-assets](https://github.com/Sun-Lab-NBB/sl-shared-assets) library.
Specifically, all data acquisition system configuration parameters are defined in the *SystemConfiguration* class named
after that system.
Additionally, since every data acquisition system requires access to the BioHPC server for long-term data storage, it
needs to be provided with server access credentials. The credentials are stored inside the 'server_credentials.yaml'
file, which is generated by using the `sl-create-server-credentials` command. **Note!** The path to the generated and
filled credential file has to be provided to the acquisition system before runtime by editing the acquisition-system
configuration file created via the command discussed above.
### Step 1: Creating a Project
All data acquisition sessions require a valid project to run. To create a new project, use the `sl-create-project`
command. This command can only be called on the main PC of a properly configured data-acquisition system (see Step 0
above). As part of its runtime, this command generates the root project directory on all machines that make up the
data acquisition system and the long-term storage destinations.
### Step 2: Creating an Experiment
All projects that involve scientific experiments also need to define at least one **experiment configuration**.
Experiment configurations are unique for each data acquisition system and are stored inside .yaml files named after the
experiment. To generate a new experiment configuration file, use the `sl-create-experiment` command. This command
generates a **precursor** experiment configuration file inside the **configuration** subdirectory, stored under the root
project directory on the main PC of the data acquisition system.
For information about the available experiment configuration parameters in the precursor file, read the
*API documentation* of the appropriate data-acquisition system available from the
[sl-shared-assets](https://github.com/Sun-Lab-NBB/sl-shared-assets) library.
### Step 3: Maintaining the Acquisition System
All acquisition systems contain modules that require frequent maintenance. Most of these modules are unique to each
acquisition system. Therefore, this section is further broken into acquisition-system-specific subsections.
#### Mesoscope-VR
The Mesoscope-VR system contains two modules that require frequent maintenance: the **water delivery system** and the
**running wheel**. To facilitate the maintenance of these modules, the sl-experiment library exposes the `sl-maintain`
command.
This command is typically called at least twice during each day the system is used to acquire data. First, it is used
at the beginning of the day to prepare the Mesoscope-VR system for runtime by filling the water delivery system. Second,
it is used at the end of each day to empty the water delivery system.
Less frequently, this command is used to re-calibrate the water delivery system, typically as a result of replacing
system components, such as tubing or the valve itself. The command is also occasionally used to replace the
surface material of the running wheel, which slowly deteriorates as the wheel is used.
This command can also facilitate cleaning the wheel, which is typically done before and after each runtime to remove
any biological contaminants left by each animal participating in experiment or training runtimes.
***Note!*** Since this runtime fulfills multiple functions, it uses an 'input'-based terminal interface to accept
further commands during runtime. To prevent visual bugs, the input does not print anything to the terminal and appears
as a blank new line. If you see a blank new line with no terminal activity, this indicates that the system is ready
to accept one of the supported commands. All supported commands are printed to the terminal as part of the runtime
initialization.
Supported vr-maintenance commands:
1. `open`. Opens the water delivery valve.
2. `close`. Closes the water delivery valve.
3. `close_10`. Closes the water delivery valve after a 10-second delay.
4. `reference`. Triggers 200 valve pulses with each pulse calibrated to deliver 5 uL of water. This command is used to
check whether the valve calibration data stored in the system_configuration.yaml file is accurate. This should
be done at the beginning of each day the system is used to acquire data. The reference runtime should overall
dispense ~ 1 ml of water. If the reference procedure does not dispense the expected volume of water, the system
needs to be recalibrated.
5. `calibrate_15`. Runs 200 valve pulses, keeping the valve open for 15 milliseconds at each pulse. This is used to
generate valve calibration data.
6. `calibrate_30`. Same as above, but uses 30-millisecond pulses.
7. `calibrate_45`. Same as above, but uses 45-millisecond pulses.
8. `calibrate_60`. Same as above, but uses 60-millisecond pulses.
9. `lock`. Locks the running wheel (engages the running-wheel brake).
10. `unlock`. Unlocks the running wheel (disengages the running-wheel brake).
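As a quick sanity check of the `reference` command's expected output, the arithmetic works out as follows:

```python
pulses = 200
volume_per_pulse_ul = 5  # microliters per calibrated pulse, per the reference command description
expected_volume_ml = pulses * volume_per_pulse_ul / 1000
# 200 pulses x 5 uL = 1000 uL, i.e. ~1 ml dispensed by a correctly calibrated reference runtime
```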
### Step 4: Acquiring Data
Each acquisition system supports one or more distinct types of data-acquisition sessions (runtimes). As a minimum, this
includes an 'experiment' session type, which is the primary use case for all acquisition systems in the lab. Some
systems may also support one or more training session types, which often do not acquire any brain activity data, but
otherwise behave similarly to experiment sessions.
#### Mesoscope-VR
The Mesoscope-VR system supports four types of runtime sessions, each associated with a unique runtime command exposed
by the sl-experiment library:
1. **Window Checking**: This session is executed by calling the `sl-check-window` command. It guides the user through
finding the imaging plane and generating the reference MotionEstimator.me and zstack.tiff files for the checked
animal. This session is typically used ~2–3 weeks after the surgical intervention and before any training or experiment
sessions to assess the quality of the intervention and the suitability of including the animal in experiment cohorts.
2. **Lick Training**: This session is executed by calling the `sl-lick-train` command. All animals that participate in
Mesoscope-VR experiments undergo a two-stage training protocol, with lick training being the first stage. During this
runtime, the animals are head-fixed in the Mesoscope enclosure for ~ 20 minutes. The primary goal of the runtime is
to teach the animals to lick at the water tube to consume water rewards and associate the sound tone emitted at reward
delivery with water coming out of the tube. During runtime, the running wheel is locked, so the animals cannot run.
3. **Run Training**: This session is executed by calling the `sl-run-train` command. This is the second stage of the
mandatory two-stage Mesoscope-VR training protocol. During this runtime, the animals are head-fixed in the Mesoscope
enclosure for ~ 40 minutes. The primary goal of the runtime is to teach the animals to run on the wheel while head-fixed
and to associate running with receiving water rewards. During runtime, the running wheel is unlocked, but the Virtual
Reality screens are kept off, ensuring that the animal is not exposed to any visual cues until the first experiment day.
4. **Mesoscope Experiment**: This session is executed by calling the `sl-experiment` command. This type of session is
designed to execute the experiment specified in the target *experiment_configuration.yaml* file (see above). The system
supports varying experiment configurations and Virtual Reality environments, offering experimenters the flexibility to
run different projects using the same set of APIs and system hardware.
### Step 5: Preprocessing and Managing Data
All acquisition systems support two major ways of handling the session data acquired at runtime. For most runtimes, the
choice of how to handle the data is made as part of the acquisition system shutdown sequence. However, in the case of
unexpected runtime terminations, all data preprocessing steps can also be executed manually by calling the appropriate
CLI command.
The first and most commonly used way is to **preprocess** the acquired data. This can be done manually by calling the
`sl-preprocess` command. Preprocessing consists of two major steps. The first step pulls all available data to the
main data acquisition system machine (PC) and re-packages (re-compresses) the data to reduce its size without loss. The
second step distributes (pushes) the data to **multiple** long-term storage destinations, such as the NAS and the
BioHPC server.
**Critical!** It is imperative that **all** valid data acquired in the lab undergoes preprocessing
**as soon as possible**. Only preprocessed data is stored in a way that maximizes its safety by using both
redundancy and parity. Data that is not preprocessed may be **lost** in the case of emergency, which is considerably
less likely for the preprocessed data.
When preprocessed data is successfully and safely pushed to long-term storage destinations, the preprocessing runtime
marks all raw data directories for that session stored on any acquisition system machine for deletion by creating the
*ubiquitin.bin* marker files. It then calls a function that finds and removes marked directories for **all available
projects and animals**. This final step can be triggered manually by calling the `sl-purge` CLI command. To optimize
disk space usage on acquisition system machines, it is recommended to call this command at least daily.
The second way of managing the data is primarily used during testing and when handling interrupted sessions that did
not generate any valid data. This involves removing all session data from **both** the data acquisition system and all
long-term storage destinations. This runtime is extremely dangerous and, if not used carefully, can
***permanently delete valid data***. This processing mode can be triggered using the `sl-delete-session-data` CLI
command, although it is not recommended for most users.
---
## API Documentation
See the [API documentation](https://sl-experiment-api-docs.netlify.app/) for the
detailed description of the methods and classes exposed by components of this library, as well as all available
CLI commands with their arguments.
___
## Recovering from Interruptions
While it is not typical for the data acquisition or preprocessing pipelines to fail during runtime, it is possible. The
library can recover or gracefully terminate the runtime for most code-generated errors, so this is usually not a
concern. However, if a major interruption (i.e., power outage) occurs or one of the hardware assets malfunctions during
runtime, manual intervention is typically required to recover the session data and reset the acquisition system.
### Data acquisition interruption
Data acquisition can be interrupted in two main ways, the first being due to an external asset failure, for example, if
the ScanImagePC unexpectedly shuts down during Mesoscope-VR system runtime. In this case, the runtime pauses and
instructs the user to troubleshoot the issue and then resume the runtime. This type of *soft* interruption is handled
gracefully during runtime, data processing, and analysis to exclude the data collected during the interruption from the
output dataset. Generally, soft interruptions are supported for most external assets, which includes anything not
managed directly by the sl-experiment library and the main data acquisition system PC. While inconvenient, these
interruptions do not typically require specialized handling other than recovering and restoring the failed asset.
**Note!** While most soft interruptions typically entail resuming the interrupted runtime, it is also possible to
instead terminate the runtime. To do so, execute the `terminate` command via the GUI instead of trying to resume
the runtime. In this case, the system attempts to execute a graceful shutdown procedure, saving all valid data in the
process.
The second way involves interruption due to sl-experiment runtime failures or unexpected shut-downs of the main
acquisition system PC. In these cases, manual user intervention is typically required to recover the useful data and
reset the system before the acquisition can be restarted. The handling of such cases often consists of steps specific
to each supported acquisition system. Typically, these *hard* interruptions are related to major issues, such as
global facility power loss or severe malfunction of sensitive acquisition system components, such as microcontrollers
and communication cables.
#### Mesoscope-VR
If the VRPC runtime unexpectedly interrupts at any point without executing the graceful shutdown, follow these
instructions:
1. If the session involved Mesoscope imaging, shut down the Mesoscope acquisition process and make sure all required
files (frame stacks, motion estimator data, cranial window screenshot) have been generated and saved to the
**mesoscope_data** folder.
2. If necessary, **manually** edit the session_descriptor.yaml, the mesoscope_positions.yaml, and the
zaber_positions.yaml files to include actual runtime information. Estimate the volume of water delivered at runtime
by manually reading the water tank level gauge.
3. Remove the animal from the Mesoscope enclosure. If necessary, use the *Zaber Launcher* app to directly interface with
Zaber motors and move them in a way that allows the animal to be recovered from the enclosure.
4. Use Zaber Launcher to **manually move the HeadBarRoll axis to have a positive angle** (> 0 degrees). This is
critical! If this is not done, the motor will not be able to home during the next session and will instead collide
with the movement guard, at best damaging the motor and, at worst, the Mesoscope.
5. Go into the 'Device Settings' tab of the Zaber Launcher, click on each Device tab (NOT motor!) and navigate to its
User Data section. Then **flip Setting 1 from 0 to 1**. Without this, the library will refuse to operate the Zaber
Motors during all future runtimes.
6. If the session involved Mesoscope imaging, **rename the mesoscope_data folder to prepend the session name, using an
   underscore to separate the session name from the folder name**. For example: mesoscope_data →
   2025-11-11-05-03-234123_mesoscope_data. **Critical!** If this is not done, the library may **delete** any leftover
   Mesoscope files during the next runtime and will not be able to properly preprocess the frames for the interrupted
   session during the next step.
7. Call the `sl-process` command and provide it with the path to the session directory of the interrupted session. This
will preprocess and transfer all collected data to the long-term storage destinations. This way, you can preserve
any data acquired before the interruption and prepare the system for running the next session.
### Data preprocessing interruption
To recover from an error encountered during preprocessing, call the `sl-process` command and provide it with the path
to the session directory of the interrupted session. The preprocessing pipeline should automatically resume an
interrupted runtime from the nearest checkpoint.
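Both recovery paths end with the same CLI call, which can be scripted as below. The positional-argument form is an assumption based on the description above; consult `sl-process --help` for the exact syntax, and note that the session path shown in the test is an example.

```python
# Sketch: resuming preprocessing for an interrupted session via the sl-process
# CLI. The positional session-directory argument is an assumption; check
# `sl-process --help` for the real interface.
import subprocess


def resume_session_processing(session_dir: str) -> list[str]:
    """Builds and runs the sl-process command for the given session directory."""
    command = ["sl-process", session_dir]
    try:
        subprocess.run(command, check=True)
    except (FileNotFoundError, subprocess.SubprocessError):
        pass  # sl-experiment is not installed or the run failed on this machine.
    return command
```

Calling `resume_session_processing("/media/Data/Experiments/Template/666/2025-11-11-05-03-234123")` is equivalent to typing the `sl-process` command with that path in a terminal.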
---
## Versioning
This project uses [semantic versioning](https://semver.org/). For the versions available, see the
[tags on this repository](https://github.com/Sun-Lab-NBB/sl-experiment/tags).
---
## Authors
- Ivan Kondratyev ([Inkaros](https://github.com/Inkaros))
- Kushaan Gupta ([kushaangupta](https://github.com/kushaangupta))
- Natalie Yeung
- Katlynn Ryu ([katlynn-ryu](https://github.com/KatlynnRyu))
- Jasmine Si
___
## License
This project is licensed under the GPL3 License: see the [LICENSE](LICENSE) file for details.
___
## Acknowledgments
- All Sun lab [members](https://neuroai.github.io/sunlab/people) for providing the inspiration and comments during the
development of this library.
- The creators of all other projects used in our development automation pipelines and source code
[see pyproject.toml](pyproject.toml).
---
Raw data
{
"_id": null,
"home_page": null,
"name": "sl-experiment",
"maintainer": null,
"docs_url": null,
"requires_python": "<3.12,>=3.11",
"maintainer_email": "Ivan Kondratyev <ik278@cornell.edu>",
"keywords": "acquisition, ataraxis, data, experiment, interface, mesoscope, sunlab",
"author": "Ivan Kondratyev, Kushaan Gupta, Natalie Yeung, Katlynn Ryu, Jasmine Si",
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/f1/09/c267976daa41e0034f66e62da6174d7be5c65c6237a6e3b6b762d1df0bfe/sl_experiment-3.0.0.tar.gz",
"platform": null,
"description": "# sl-experiment\n\nA Python library that provides tools to acquire, manage, and preprocess scientific data in the Sun (NeuroAI) lab.\n\n\n\n[](https://github.com/astral-sh/uv)\n[](https://github.com/astral-sh/ruff)\n\n\n\n\n\n___\n\n## Detailed Description\n\nThis library functions as the central hub for collecting and preprocessing the data for all current and future projects \nin the Sun lab. To do so, it exposes the API to interface with all data acquisition systems in the lab. Primarily, this \nrelies on specializing various general-purpose libraries released as part of the 'Ataraxis' science-automation project\nto work within the specific hardware implementations available in the lab.\n\nThis library is explicitly designed to work with the specific hardware and data handling strategies used in the Sun lab \nand will likely not work in other contexts without extensive modification. The library broadly consists of two \nparts: the shared assets and the acquisition-system-specific bindings. The shared assets are reused by all acquisition \nsystems and are mostly inherited from the [sl-shared-assets](https://github.com/Sun-Lab-NBB/sl-shared-assets) library. \nThe acquisition-system-specific code is tightly integrated with the hardware used in the lab and is generally not \ndesigned to be reused in any other context. 
See the [data acquisition systems](#data-acquisition-systems) section for \nmore details on currently supported acquisition systems.\n\n___\n\n## Table of Contents\n- [Installation](#installation)\n- [Data Acquisition Systems](#data-acquisition-systems)\n- [Mesoscope-VR System](#Mesoscope-vr-data-acquisition-system)\n- [Acquired Data Structure and Management](#acquired-data-structure-and-management)\n- [Acquiring Data in the Sun Lab](#acquiring-data-in-the-sun-lab)\n- [API Documentation](#api-documentation)\n- [Recovering from Interruptions](#recovering-from-interruptions)\n- [Versioning](#versioning)\n- [Authors](#authors)\n- [License](#license)\n- [Acknowledgements](#Acknowledgments)\n\n---\n\n## Installation\n\n### Source\n\nNote, installation from source is ***highly discouraged*** for anyone who is not an active project developer.\n\n1. Download this repository to your local machine using your preferred method, such as Git-cloning. Use one\n of the stable releases from [GitHub](https://github.com/Sun-Lab-NBB/sl-experiment/releases).\n2. Unpack the downloaded zip and note the path to the binary wheel (`.whl`) file contained in the archive.\n3. Run ```python -m pip install WHEEL_PATH```, replacing 'WHEEL_PATH' with the path to the wheel file, to install the \n wheel into the active python environment.\n\n### pip\nUse the following command to install the library using pip: ```pip install sl-experiment```.\n\n___\n\n## Data Acquisition Systems\n\nA data acquisition (and runtime control) system can be broadly defined as a collection of hardware and software tools \nused to conduct training or experiment sessions that acquire scientific data. Each data acquisition system can use one \nor more machines (PCs) to acquire the data, with this library (sl-experiment) typically running on the **main** data \nacquisition machine. 
Additionally, each system typically uses a Network Attached Storage (NAS), a remote storage server,\nor both to store the data after the acquisition safely (with redundancy and parity).\n\nIn the Sun lab, each data acquisition system is built around the main tool used to acquire the brain activity data. For\nexample, the main system in the Sun lab is the [Mesoscope-VR](#Mesoscope-vr-data-acquisition-system) system, which uses \nthe [2-Photon Random Access Mesoscope (2P-RAM)](https://elifesciences.org/articles/14472). All other components of that \nsystem are built around the Mesoscope to facilitate the acquisition of the brain activity data. Due to this inherent \nspecialization, each data acquisition system in the lab is treated as an independent unit that requires custom software\nto acquire, preprocess, and process the resultant data.\n\n***Note!*** Since each data acquisition system is unique, the section below will be iteratively expanded to include \nsystem-specific assembly instructions for **each supported acquisition system**. Commonly, updates to this section \ncoincide with major or minor library version updates.\n\n---\n\n## Mesoscope-VR Data Acquisition System\n\nThis is the main data acquisition system currently used in the Sun lab. The system broadly consists of four major \nparts: \n1. The [2-Photon Random Access Mesoscope (2P-RAM)](https://elifesciences.org/articles/14472), assembled by \n [Thor Labs](https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=10646) and controlled by \n [ScanImage](https://www.mbfbioscience.com/products/scanimage/) software. The Mesoscope control and data acquisition \n are performed by a dedicated computer referred to as the **'ScanImagePC'** or, (less frequently) the\n **'Mesoscope PC'**. This PC is assembled and configured by the [MBF Bioscience](https://www.mbfbioscience.com/). 
The\n only modification carried out by the Sun lab during assembly was the configuration of a Server Message Block (SMB)\n protocol access to the root folder used by the ScanImage software to save the Mesoscope data.\n2. The [Unity game engine](https://unity.com/products/unity-engine) running the Virtual Reality game world used in all \n experiments to control the task environment and resolve the task logic. The virtual environment runs on the main data\n acquisition computer referred to as the **'VRPC'** and relies on the [MQTT](https://mqtt.org/) communication protocol\n and the [Sun lab implementation of the GIMBL package](https://github.com/Sun-Lab-NBB/Unity-tasks) to bidirectionally \n interface with the virtual task environment.\n3. The [microcontroller-powered hardware](https://github.com/Sun-Lab-NBB/sl-micro-controllers) that allows the animal \n to bidirectionally interface with various physical components (modules) of the Mesoscope-VR systems.\n4. A set of visual and IR-range cameras, used to acquire behavior video data.\n\n### Main Dependency\n- ***Linux*** operating system. While the library *may* also work on Windows and (less likely) macOS, it has been \n explicitly written for and tested on the mainline [6.11 kernel](https://kernelnewbies.org/Linux_6.11) and \n Ubuntu 24.10 distribution of the GNU Linux operating system using [Wayland](https://wayland.freedesktop.org/) window\n system architecture.\n\n### Software Dependencies\n***Note!*** This list only includes *external dependencies*, which are installed *in addition* to all \ndependencies automatically installed from pip / conda as part of library installation. The dependencies below have to\nbe installed and configured on the **VRPC** before calling runtime commands via the command-line interface (CLI) exposed\nby this library.\n\n- [MQTT broker](https://mosquitto.org/) version **2.0.21**. 
The broker should be running locally and can use \n the **default** IP (127.0.0.1) and Port (1883) configuration.\n- [FFMPEG](https://www.ffmpeg.org/download.html). As a minimum, the version of FFMPEG should support H265 and H264 \n codecs with hardware acceleration (Nvidia GPU). This library was tested with the version **7.1.1-1ubuntu1.1**.\n- [MvImpactAcquire](https://assets-2.balluff.com/mvIMPACT_Acquire/). This library is tested with version **2.9.2**, \n which is freely distributed. Higher GenTL producer versions will likely work too, but they require purchasing a \n license.\n- [Zaber Launcher](https://software.zaber.com/zaber-launcher/download) version **2025.6.2-1**.\n- [Unity Game Engine](https://unity.com/products/unity-engine) version **6000.1.12f1**.\n\n### Hardware Dependencies\n\n**Note!** These dependencies only apply to the **VRPC**. Hardware dependencies for the **ScanImagePC** are determined \nand controlled by MBF and ThorLabs. This library benefits from the **ScanImagePC** being outfitted with a 10-GB network \ncard, but this is not a strict requirement. \n\n- [Nvidia GPU](https://www.nvidia.com/en-us/). This library uses GPU hardware acceleration to encode acquired video \n data. Any Nvidia GPU with hardware encoding chip(s) should work as expected. The library was tested with \n [RTX 4090](https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4090/).\n- A CPU with at least 12, preferably 16, physical cores. This library has been tested with \n [AMD Ryzen 7950X CPU](https://www.amd.com/en/products/processors/desktops/ryzen/7000-series/amd-ryzen-9-7950x.html). \n It is recommended to use CPUs with 'full' cores, instead of the modern Intel\u2019s design of 'e' and 'p' cores \n for predictable performance of all library components.\n- A 10-Gigabit capable motherboard or Ethernet adapter, such as [X550-T2](https://shorturl.at/fLLe9). 
Primarily, this is\n required for the high-quality machine vision camera used to record videos of the animal\u2019s face. The 10-Gigabit lines\n are also used for transferring the data between the PCs used in the data acquisition process and destination machines \n used for long-term data storage (see [acquired data management section](#acquired-data-structure-and-management) for \n more details).\n\n### System Assembly\n\nThe Mesoscope-VR system consists of multiple interdependent components. We are constantly making minor changes to the \nsystem to optimize its performance and facilitate novel experiments and projects carried out in the lab. Treat this \nsection as a general system composition guide, but consult lab publications over this section for instructions on \nbuilding specific system implementations used to acquire the data featured in different publications.\n\nPhysical assembly and mounting of ***all*** hardware components mentioned in the specific subsections below is discussed\nin the [main Mesoscope-VR assembly section](#Mesoscope-vr-assembly).\n\n### Zaber Motors\nAll brain activity recordings with the Mesoscope require the animal to be head-fixed. To orient head-fixed animals on \nthe Virtual Reality treadmill (running wheel) and promote task performance, the system uses three groups of motors \ncontrolled through Zaber motor controllers. The first group, the **HeadBar**, is used to position the animal\u2019s head in \nZ, Pitch, and Roll axes. Together with the movement axes of the Mesoscope, this allows for a wide range of \nmotions necessary to align the Mesoscope objective with the brain imaging plane. The second group of \nmotors, the **LickPort**, controls the position of the water delivery port (tube) (and sensor) in X, Y, and Z axes. 
This\nis used to ensure all animals have comfortable access to the water delivery tube, regardless of their head position.\nThe third group of motors, the **Wheel**, controls the position of the running wheel in the X-axis relative to the \nhead-fixed animal\u2019s body and is used to position the animal on the running wheel to promote good running behavior.\n\nThe current snapshot of Zaber motor configurations used in the lab, alongside the motor parts list and electrical wiring\ninstructions is available \n[here](https://drive.google.com/drive/folders/1SL75KE3S2vuR9TTkxe6N4wvrYdK-Zmxn?usp=drive_link).\n\n**Warning!** Zaber motors have to be configured correctly to work with this library. To (re)configure the motors to work\nwith the library, apply the setting snapshots from the link above via the \n[Zaber Launcher](https://software.zaber.com/zaber-launcher/download) software. Make sure you read the instructions in \nthe 'Applying Zaber Configuration' document for the correct application procedure.\n\n**Although this is highly discouraged, you can also edit the motor settings manually**. To configure the motors\nto work with this library, you need to overwrite the non-volatile User Data of each motor device (controller) with\nthe data expected by this library:\n1. **User Data 0**: Device CRC Code. This variable should store the CRC32-XFER checksum of the device\u2019s name \n (user-defined name). During runtime, the library generates the CRC32-XFER checksum of each device\u2019s name and compares\n it against the value stored inside the User Data 0 variable to ensure that each device is configured appropriately to\n work with the sl-experiment library. **Hint!** Use the `sl-crc` console command to generate the CRC32-XFER checksum \n for each device during manual configuration, as it uses the same code as used during runtime and, therefore, \n guarantees that the checksums will match.\n2. **User Data 1**: Device ShutDown Flag. 
This variable is used as a secondary safety measure to ensure each device has \n been properly shut down during previous runtimes. As part of the manual device configuration, make sure that this \n variable is set to **1**. Otherwise, the library will not start any runtime that involves that Zaber motor. During \n runtime, the library sets this variable to 0, making it impossible to use the motor without manual intervention again\n if the runtime interrupts without executing the proper shut down sequence that sets the variable back to 1.\n **Warning!** It is imperative to ensure that all motors are parked at positions where they are **guaranteed** to \n successfully home after power cycling before setting this to 1. Otherwise, it is possible for some motors to collide\n with other system components and damage the acquisition system.\n3. **User Data 10**: Device Axis Type Flag. This variable should be set to **1** for motors that move on a linear axis \n and **0** for motors that move on a rotary axis. This static flag is primarily used to support proper unit \n conversions and motor positioning logic during runtimes.\n4. **User Data 11**: Device Park Position. This variable should be set to the position, in native motor units, where \n the device should be moved as part of the 'park' command and the shut-down sequence. This is used to position all \n motors in a way that guarantees they can be safely 'homed' at the beginning of the next runtime. Therefore, each\n park position has to be selected so that each motor can move to their 'home' sensor without colliding with any other\n motor **simultaneously** moving towards their 'home' position. **Note!** The lick-port uses the 'park' position as \n the **default** imaging position. During runtime, it will move to the 'park' position if it has no animal-specific \n position to use during imaging. 
Therefore, make sure that the park position for the lick-port is always set so that \n it cannot harm the animal mounted in the Mesoscope enclosure while moving to the park position from any other \n position.\n5. **User Data 12** Device Maintenance Position. This variable should be set to the position, in native motor units, \n where the device should be moved as part of the 'maintain' command. Primarily, this position is used during water \n delivery system calibration and the running-wheel surface maintenance. Typically, this position is calibrated to\n provide easy access to all hardware components of the system by moving all motors as far away from each other as \n reasonable.\n6. **User Data 13**: Device Mount Position. This variable should be set to the position, in native motor units, where \n the device should be moved as part of the 'mount' command. For the lick-port, this position is usually far away from \n the animal, which facilitates mounting and unmounting the animal from the rig. For the head-bar and the wheel motor \n groups, this position is used as the **default** imaging position. Therefore, set the head-bar and the wheel 'mount'\n positions so that any (new) animal can be comfortably and safely mounted in the Mesoscope enclosure.\n\n### Behavior Cameras\nTo record the animal\u2019s behavior, the system uses a group of three cameras. The **face_camera** is a high-end\nmachine-vision camera used to record the animal\u2019s face with approximately 3-MegaPixel resolution. The **left-camera** \nand the **right_camera** are 1080P security cameras used to record the body of the animal. Only the data recorded by the\n**face_camera** is currently used during data processing and analysis, but the data from all available cameras is saved \nduring acquisition. 
To interface with the cameras, the system leverages customized\n[ataraxis-video-system](https://github.com/Sun-Lab-NBB/ataraxis-video-system) bindings.\n\nSpecific information about the cameras and related imaging hardware, as well as the snapshot of the configuration \nparameters used by the **face_camera**, is available \n[here](https://drive.google.com/drive/folders/1l9dLT2s1ysdA3lLpYfLT1gQlTXotq79l?usp=sharing).\n\n### MicroControllers\nTo interface with all other hardware components **other** than cameras and Zaber motors, the Mesoscope-VR system uses \nTeensy 4.1 microcontrollers with specialized \n[ataraxis-micro-controller](https://github.com/Sun-Lab-NBB/ataraxis-micro-controller) code. Currently, the system \nuses three isolated microcontroller subsystems: **Actor**, **Sensor**, and **Encoder**.\n\nFor instructions on assembling and wiring the electronic components used in each microcontroller system, as well as the \ncode running on each microcontroller, see the \n[microcontroller repository](https://github.com/Sun-Lab-NBB/sl-micro-controllers).\n\n### Virtual Reality Task Environment (Unity)\nThe task environment used in all Mesoscope-VR experiments is rendered and controlled by the Unity game engine. To make \nUnity work with this library, each project-specific Unity task must use the bindings and assets released as part of the \n[Unity-tasks repository](https://github.com/Sun-Lab-NBB/Unity-tasks). Follow the instructions from that repository to \nset up Unity Game engine to interface with this library and to create new virtual task environments.\n\n**Note!** This library does not contain tools to initialize Unity Game engine. The desired Virtual Reality task\nhas to be started ('armed') ***manually*** before entering the main runtime (data acquisition session) cycle. The main \nUnity repository contains more details about starting the virtual reality tasks when running experiments. 
During \nCLI-driven experiment runtimes, the library instructs the user when to 'arm' the Unity game engine.\n\n### Google Sheets API Integration\n\nThis library interacts with the shared Google Sheet files used in the Sun lab to track and communicate certain \ninformation about the animals that participate in all projects. Currently, this includes two files: the **surgery log** \nand the **water restriction log**. Primarily, this integration is used to ensure that all information about each \nexperiment subject (animal) is stored in the same location (on the long-term storage machine(s)). Additionally, it is \nused in the lab to automate certain data logging tasks.\n\n#### Setting up Google Sheets API Access\n\n**If you already have a service Google Sheets API account, skip to the next section.** Most lab members can safely \nignore this section, as all service accounts are managed at the acquisition-system level, rather than individual lab \nmembers.\n\n1. Log into the [Google Cloud Console](https://console.cloud.google.com/welcome). \n2. Create a new project.\n3. Navigate to APIs & Services \u2192 Library and enable the Google Sheets API for the project. \n4. Under IAM & Admin \u2192 Service Accounts, create a service account. This will generate a service account ID in the format\n of `your-service-account@gserviceaccount.com`.\n5. Use Actions \u2192 Manage Keys and, if a key does not already exist, create a new key and download it in JSON format. \n This key is then used to access the Google Sheets.\n\n#### Adding Google Sheets Access to the Service Account\nTo access the **surgery log** and the **water restriction log** Google Sheets as part of this library runtime, create \nand share these log files with the email of the service account created above. The service account requires **Editor** \naccess to both files.\n\n**Note!** This feature requires that both log files are formatted according to the available Sun lab templates. 
\nOtherwise, the parsing algorithm will not behave as expected, leading to runtime failures. Additionally, both log files \nhave to be pre-filled in advance, as the processing code is not allowed to automatically generate new table (log) rows.\n**Hint!** Currently, it is advised to pre-fill the data a month in advance. Since most experiments last for at most a \nmonth, this usually covers the entire experiment period for any animal.\n\n### ScanImage PC Assets\nAs mentioned above, the ScanImagePC is largely assembled and configured by external contractors. However, the PC \nrequires additional assets and configuration post-assembly to make it compatible with sl-experiment-managed runtimes.\n\n#### File System Access\nTo support the sl-experiment runtime, the ScanImagePC\u2019s filesystem must be accessible to the **VRPC** via the Server \nMessage Block version 3 (SMB3) or equivalent protocol. Since ScanImagePC uses Windows, it is advised to use the SMB3 \nprotocol, as all Windows machines support it natively with minimal configuration. As a minimum, the ScanImagePC must be \nconfigured to share the root Mesoscope output folder with the VRPC over the local network. This is required to both \nfetch the data acquired by the Mesoscope during preprocessing and to control the Mesoscope during runtime.\n\n#### Default Screenshot Directory\nDuring runtime, the sl-experiment library prompts the user to generate a screenshot of the ScanImagePC desktop and \nplace it in the network-shared mesoscope data folder (see above). The screenshot is used to store the information about\nthe red-dot alignment, the acquisition parameters, and the state of the imaging plane at the beginning of each session. \nThe library is statically configured to fetch the screenshot from the shared folder and will not look in any other \ndirectories. 
Therefore, it is advised to reconfigure the default output directory used by the 'Win + PrntSc' command on \nthe ScanImagePC to save the screenshots into the shared Mesoscope output directory.\n\n#### MATLAB Assets\nScanImage software is written in MATLAB and controls all aspects of Mesoscope data acquisition. While each runtime \nrequires the experimenter to manually interface with the ScanImage GUI during Mesoscope preparation, all data \nacquisition runtimes using the sl-experiment library require the user to call the **setupAcquisition** MATLAB function\navailable from [mesoscope assets repository](https://github.com/Sun-Lab-NBB/sl-mesoscope-assets). This function carries\nout multiple runtime-critical tasks, including setting up the acquisition, generating and applying the online motion \ncorrection algorithm, and allowing the VRPC to control the Mesoscope via creating or removing binary marker files.\n\nTo configure MATLAB to access the mesoscope assets, git-clone the entire repository to the ScanImagePC. Then, follow the\ntutorials [here](https://www.mathworks.com/help/matlab/matlab_env/add-remove-or-reorder-folders-on-the-search-path.html)\nand add the path to the root mesoscope assets folder to MATLAB\u2019s search path. MATLAB will then be able to use all \nfunctions from that repository, including the setupAcquisition function. The repository also contains the online \nmotion estimation and correction assets developed in the [Pachitariu and Stringer lab](https://mouseland.github.io/), \nwhich are required for the setupAcquisition function to work as expected.\n\n### Mesoscope-VR Assembly\n***This section is currently a placeholder. Since the final Mesoscope-VR system design is still a work in progress, it \nwill be populated once the final design implementation is constructed and tested in the lab.***\n\nThe Mesoscope-VR assembly mostly consists of two types of components. 
First, it includes custom components manufactured \nvia 3D-printing or machining (for metalwork). Second, it consists of generic components available from vendors such as \nThorLabs, which are altered in workshops to fit the specific requirements of the Mesoscope-VR system. The blueprints and\nCAD files for all components of the Mesoscope-VR systems, including CAD renders of the assembled system, are available \n[here](https://drive.google.com/drive/folders/1Oz2qWAg3HkMqw6VXKlY_c3clcz-rDBgi?usp=sharing).\n\n___\n\n## Acquired Data Structure and Management\n\nThe library defines a fixed structure for storing all acquired data which uses a 4-level directory tree hierarchy: \n**root** (volume), **project**, **animal**, and **session**. This structure is reused by all acquisition systems, and \nit is maintained across all long-term storage destinations. After each data acquisition runtime (session), all \nraw data is stored under the **root/project/animal/session/raw_data** directory stored on one or more machines mentioned\nbelow. After each data processing pipeline runtime, all processed data generated by that pipeline is stored under the \n**root/project/animal/session/processed_data**.\n\nCurrently, each data acquisition system in the lab uses at least three machines: \n1. The **main data acquisition PC** is used to acquire and preprocess the data. For example, the *VRPC* of the \n *Mesoscope-VR* system is the main data acquisition PC for that system. This PC is used to both **acquire** the data \n and, critically, to **preprocess** the data before it is moved to the long-term storage destinations.\n2. The **BioHPC compute server** is the main long-term storage destination and the machine used to process and \n analyze the data. This is a high-performance computing server owned by the lab that can be optionally extended by \n renting additional nodes from Cornell\u2019s BioHPC cluster. 
The BioHPC server stores both the raw data and the processed \n data generated by all Sun lab processing pipelines.\n3. The **Synology NAS** is the back-up 'cold' long-term storage destination. It only stores the raw data and is \n physically located in a different building from the main BioHPC compute server to provide data storage redundancy. It\n is only used to back up raw data and is generally not intended to be accessed unless the main data storage becomes \n compromised for any reason.\n\n***Critical!*** Each data acquisition system is designed to **mount the BioHPC and the NAS to the main acquisition PC \nfilesystem using the Server Message Block 3 (SMB3) protocol**. Therefore, each data acquisition system operates on the \nassumption that all storage component filesystems are used contiguously and can be freely accessed by the main \nacquisition PC OS.\n\n***Note!*** The library tries to maintain at least two copies of data for long-term storage: one on the NAS and the \nother on the BioHPC server. It is configured to purge redundant data from the data acquisition system machines if\nthe data has been **safely** moved to the long-term storage destinations. The integrity of the moved data is verified\nusing xxHash-128 checksum before the data is removed from the data-acquisition system.\n\n### Root Directory (Volume)\nAll data acquisition systems, the Synology NAS and the BioHPC server keep **ALL** Sun lab projects in the same **root** \ndirectory. The BioHPC server uses **two roots**, one for the raw data and the other for the processed data. This \nseparation is due to the BioHPC server using a combination of fast NVME drives and slow HDD drives to optimize storage \ncost and data processing performance. The exact location and name of the root directory on each machine is arbitrary \nbut should generally remain fixed (unchanging) over the entire lifetime of that specific machine. 

### Project Directory
When a new project is created, a **project** directory **named after the project** is created under the **root** directory of the main data acquisition machine, the Synology NAS, and both the raw and processed data BioHPC volumes. Depending on the host machine, this project directory may contain further subdirectories.

All data acquisition systems also create a **configuration** subdirectory under the root project directory. This directory stores all supported experiment configurations for the project. Each 'sl-experiment' CLI command call searches the configuration directory for the .yaml file named after the target experiment and loads the experiment data from it.

### Animal Directory
When the library is used to acquire data for a new animal, it generates a new **animal** directory under the **root** and **project** directory combination. The directory uses the animal's ID as its name. Depending on the host machine, this animal directory may contain further subdirectories.

All data acquisition systems also create a **persistent_data** subdirectory under the root animal directory, which stores data that is reused between data acquisition sessions.

***Critical!*** The current Sun lab convention stipulates that all animal IDs should be numeric. While some library components do accept strings as inputs, it is expected that all animal IDs consist only of positive integers. Failure to adhere to this naming convention can lead to runtime errors and unexpected behavior in all library components!

### Session Directory
Each time the library is used to acquire data, a new session directory is created under the **root**, **project**, and **animal** directory combination. The session name is derived from the current ***UTC*** timestamp at the time of the session directory's creation, accurate to ***microseconds***. This naming format was chosen primarily so that all sessions acquired by the same acquisition system have unique and chronologically sortable names. The session name format follows the order of **YYYY-MM-DD-HH-MM-SS-US**.

### Raw Data and Processed Data
All data acquired by this library is stored under the **raw_data** subdirectory generated for each session. An example path to the acquired (raw) data can therefore look like this: `/media/Data/Experiments/Template/666/2025-11-11-05-03-234123/raw_data/`.

Similarly, all Sun lab data processing pipelines generate new files and subdirectories under the **processed_data** subdirectory of each session. An example path to the processed_data directory can therefore look like this: `/server/sun_data/Template/666/2025-11-11-05-03-234123/processed_data`.

***Note!*** This library treats **both** newly acquired and preprocessed data as **raw**. This is because preprocessing **does not change the content of the data**. Preprocessing only applies lossless compression to package the data more efficiently for transmission, and the preprocessed data can be converted back to the original format at any time. Processing the data, on the other hand, generates additional data and / or modifies the processed data values in ways that may not necessarily be reversible.

### Shared Raw Data

The section below briefly lists the data acquired by **all** Sun lab data acquisition systems. Note that each acquisition system also generates **system-specific** data, which is listed under the acquisition-system-specific sections that follow.

**Note!** For information about the **processed** data, see the [main data processing library](https://github.com/Sun-Lab-NBB/sl-forgery).

After acquisition and preprocessing, the **raw_data** folder of each acquisition system will, at a minimum, contain the following files and subdirectories:
1. **ax_checksum.txt**: Stores the xxHash-128 checksum used to verify data integrity when it is transferred to the long-term storage destination. The checksum is generated before the data leaves the main data acquisition system PC and, therefore, accurately captures the final state of the raw data before it enters storage.
2. **hardware_state.yaml**: Stores the snapshot of the dynamically calculated parameters used by the data acquisition system modules during runtime. These parameters are recalculated at the beginning of each data acquisition session and are rounded and stored using the appropriate floating point type (usually fp64) to minimize floating point rounding errors. This improves the quality of processed data by ensuring that the processing and the acquisition pipelines use the same floating-point parameter values. This file is also used to determine which modules were used during runtime and, consequently, which data can be parsed from the .npz log files generated at runtime (see below).
3. **session_data.yaml**: Stores information necessary to maintain the same session data structure across all machines used during data acquisition and long-term storage. This file is used by all Sun lab libraries as an entry point for working with the session's data. The file also includes all available information about the identity and purpose of the session and can be used by human experimenters to identify the session. Since version 3.0.0, the file also stores the versions of sl-experiment and Python that were used to acquire the data.
4. **session_descriptor.yaml**: Stores session-type-specific information, such as the task parameters and experimenter notes. The contents of the file differ between session types, although some fields are reused by all sessions.
The contents of this file are written partially by the library (automatically) and partially by the experimenter (manually, at the end of each session). At the end of each runtime, a copy of the descriptor file is cached inside the *persistent_data* directory of the animal, replacing any existing copy. This is used to optionally restore certain runtime configuration parameters between session types that support this functionality.
5. **surgery_metadata.yaml**: Stores the data on the surgical intervention(s) performed on the animal that participated in the session. This data is extracted from the **surgery log** Google Sheet and, for most animals, should be the same across all sessions.
6. **system_configuration.yaml**: Stores the configuration parameters of the data acquisition system that generated the session data. This is a snapshot of **all** dynamically addressable configuration parameters used by the system. When combined with the assembly instructions and the appropriate sl-experiment library version, it allows completely replicating the data acquisition system used to acquire the session data.
7. **behavior_data**: Stores compressed .npz log files that contain all non-video behavior data acquired by the system. This includes all messages sent or received by each microcontroller, and the timestamps for the frames acquired by each camera and (if applicable) the main brain activity recording device (e.g.: Mesoscope). These logs also include session metadata, such as trials, task conditions, and system and runtime state transitions. Although the exact content of the behavior data folder can differ between acquisition systems, all systems used in the lab generate some form of non-video behavior data.
8. **camera_data**: Stores the behavior videos recorded by the video cameras used by the acquisition system. While not technically required for every system, all current Sun lab data acquisition systems use one or more cameras to acquire behavior videos.
9. **experiment_configuration.yaml**: This file is only created for **experiment** sessions. It stores the configuration of the experiment task performed by the animal during runtime. The contents of the file differ for each data acquisition system, but each system generates a version of this file. The file contains enough information to fully replicate the experiment runtime on the same acquisition system and to process and analyze the acquired data.
10. **telomere.bin**: This marker is used to communicate whether the session data is **complete**. Incomplete sessions arise when the session acquisition runtime is unexpectedly interrupted. This is rare and is typically the result of major emergencies, such as sudden power loss or other unforeseen events. Incomplete sessions are automatically excluded from automated data processing and require manual user intervention to assess the usability of the session for analysis. This marker file is created **exclusively** as part of session data preprocessing, based on the value of the 'incomplete' field of the **session_descriptor.yaml** file.
11. **integrity_verification_tracker.yaml**: This tracker file is used internally to run and track the outcome of the remote data verification procedure. This procedure runs as part of moving the data to the long-term storage destinations to ensure the data is transferred intact. Users can optionally check the status of the verification by accessing the data stored inside the file. **Note!** This file is added **only!** to the raw_data folder stored on the BioHPC server.

### Shared Temporary Data
The sl-experiment library additionally uses the following temporary marker files and directories, which are cleared before the raw data is transmitted to the long-term storage destinations:
1. **nk.bin**: This marker is automatically cached to disk as part of creating a new session data hierarchy. Each runtime removes this marker file when it successfully completes its runtime preparation (the main start() method call of the runtime management class). If this marker exists when the runtime enters the shutdown cycle, this indicates that the runtime encountered a fatal error during startup and had to be terminated early. In this case, the session's data is silently deleted, as an uninitialized runtime necessarily contains no valid data. This automatically declutters the data acquisition and long-term storage PCs so that they only keep valid sessions.
2. **behavior_data_log**: All behavior log entries are initially saved as individual .npy files. Each .npy file stores a serialized log message in the form of an uint8 (byte) NumPy array. Since messages are cached to disk as soon as they are received by the DataLogger to minimize data loss in case of emergency shutdowns, the temporary behavior_data_log directory is used to store these messages during runtime. Frequently, the directory accumulates millions of .npy files at runtime, making it challenging for human operators to work with the data. During preprocessing, individual .npy files are grouped by their source (what made the log entry, e.g.: VideoSystem, MicroController, Data Acquisition System, etc.) and are compressed into .npz archives, one for each source. The .npz archives are then moved to the *behavior_data* folder, and the *behavior_data_log* directory with the individual .npy files is deleted to conserve disk space.
3. **ubiquitin.bin**: This marker file is created during the *preprocessing* runtime to mark no-longer-needed directories for deletion.
Specifically, when preprocessing safely moves the acquired data to the long-term storage destinations, it marks all now-redundant raw data on all acquisition system machines (PCs) for removal to optimize disk usage. At the end of preprocessing, a specialized *purge* runtime is executed to discover and remove all directories marked for deletion with ubiquitin.bin marker files.

### Mesoscope-VR System Data

The Mesoscope-VR system instantiates a directory hierarchy on both the VRPC and the ScanImagePC. Below is the list of files and directories found on each of these machines.

#### Raw Data

The Mesoscope-VR system generates the following files and directories on the VRPC, in addition to those discussed in the shared raw data section:
1. **mesoscope_data**: Stores all Mesoscope-acquired data (frames, motion estimation files, etc.). Since Mesoscope data is only acquired for **experiment** sessions, this directory is kept empty for all other session types. During preprocessing, the folder contents are organized to work automatically with the [sl-suite2p](https://github.com/Sun-Lab-NBB/suite2p) single-day processing pipeline. All Mesoscope data is intended to be processed with the sl-suite2p library.
2. **zaber_positions.yaml**: Stores the snapshot of the positions used by the HeadBar, LickPort, and Wheel Zaber motor groups, taken at the end of the session's data acquisition. All positions are stored in native motor units. This file is created for all session types supported by the Mesoscope-VR system. As a backup, a copy of this file is also generated at the beginning of each session runtime. This allows recovering from critical runtime failures, where the runtime may not be able to generate this snapshot. At both snapshot generation timepoints, a copy of the generated snapshot file is also cached inside the *persistent_data* directory of the animal to support restoring the motors to the same position during the next session.
3. **mesoscope_positions.yaml**: Stores the snapshot of the physical Mesoscope objective position in the X, Y, Z, and Roll axes, the virtual ScanImage axes (Fast Z, Tip, Tilt), and the laser power at the sample, taken at the end of the session's data acquisition. **Note!** This file relies on the experimenter updating the stored positions if they are changed during runtime. It is only created for window checking and experiment sessions. A copy of this snapshot file is also saved to the *persistent_data* directory of the animal to support restoring the Mesoscope to the same imaging field during the next session.
4. **window_screenshot.png**: Stores the screenshot of the ScanImagePC screen. The screenshot should contain the image of the red-dot alignment, the view of the target cell layer, the Mesoscope position information, and the data acquisition parameters. Primarily, the screenshot is used by experimenters to quickly reference the imaging quality of each experiment session. This file is only created for window checking and experiment sessions. A copy of this file is saved to the *persistent_data* directory of the animal to help the user realign the red-dot to a similar position during the next session.

#### ScanImagePC

All Mesoscope-VR system data on the ScanImagePC is stored under the user-defined ScanImagePC root directory, which is expected to be mounted to the VRPC via SMB or a similar protocol. Under that root directory, the system creates the following directories and files:
1. **mesoscope_data**: This directory stores all Mesoscope-acquired data for the currently running session. The *setupAcquisition* MATLAB function configures the ScanImage software to output all data to the mesoscope_data directory, which is shared by all sessions, animals, and projects. This allows using the same static output path for all ScanImage acquisitions.
2. **session-specific mesoscope_data**: At the end of each runtime, the Mesoscope-VR system ***renames*** the mesoscope_data directory to include the session name (id). Then, it generates an empty mesoscope_data directory for the next runtime. This way, all data of each completed session is stored under a unique directory named after that session. This step is crucial for data preprocessing, which identifies the session data directory and pulls it over to the VRPC based on the session name (id).
3. **persistent_data**: This directory is created for each unique **project** and **animal** combination, similar to the data structure created by sl-experiment on the main acquisition system PC. This directory contains the **first day** MotionEstimator.me and fov.roi files. These files are typically reused by all following data acquisition sessions to restore the imaging field to the same location as used on the first day. The full path to the persistent_data directory would typically look like **root/project/animal/persistent_data**.

#### Mesoscope-VR Temporary Data

The Mesoscope-VR system also generates the following temporary files and directories during runtime:
1. **raw_mesoscope_data**: Stores the uncompressed .TIFF stacks fetched by the VRPC from the ScanImagePC. This is done as part of data preprocessing to collect all data on the VRPC before executing the individual preprocessing subroutines. The .TIFF stacks are then re-compressed using the Limited Error Raster Compression (LERC) scheme and saved to the *mesoscope_data* folder. Once this process completes successfully, the *raw_mesoscope_data* directory with all raw files is deleted to conserve disk space.
2. **kinase.bin**: This marker is created in the *mesoscope_data* ScanImagePC directory. During runtime, the *setupAcquisition* MATLAB function monitors the *mesoscope_data* directory for the presence of this marker file. If the file is present, the function triggers the Mesoscope data acquisition. If the file is absent, the function stops the Mesoscope data acquisition until the file is re-created. As such, the VRPC uses this marker file to start and stop the Mesoscope data acquisition during normal operation.
3. **phosphatase.bin**: This marker works similarly to the *kinase.bin* marker but is used by the VRPC to abort the ScanImagePC runtime at any point. When the ScanImagePC is waiting for the *kinase.bin* marker to be created for the first time, stopping the Mesoscope acquisition would normally require the kinase marker to be first created and then removed, triggering a brief Mesoscope frame acquisition. Creating the *phosphatase.bin* marker instead triggers the ScanImagePC to end the runtime without waiting for the *kinase.bin* marker, effectively aborting the runtime without acquiring any frames.

---

## Acquiring Data in the Sun Lab

All user-facing library functionality is realized through a set of Command Line Interface (CLI) commands automatically exposed when the library is pip-installed into a Python environment. Some of these commands take additional arguments that allow further configuring their runtime. Use the `--help` argument when calling any of the commands described below to see the list of supported arguments together with their descriptions and default values.

To use any of the commands described below, activate the Python environment where the library is installed, e.g., with `conda activate MYENV`, and type one of the commands described below.

***Warning!*** Most commands described below use the terminal to communicate important runtime information to the user or request user feedback.
**Make sure you carefully read every message printed to the terminal during runtime**. Failure to do so may damage the equipment or harm the animal!

### Step 0: Configuring the Data Acquisition System

Before acquiring data, each acquisition system has to be configured. This step is done in addition to assembling the system and installing the required hardware components. Typically, this only needs to be done when the acquisition system configuration or hardware changes, so most lab members can safely skip this step.

Use the `sl-create-system-config` command to generate the system configuration file. As part of its runtime, the command configures the host machine to remember the path to the generated configuration file, so all future sl-experiment runtimes on that machine will automatically load and use the appropriate acquisition-system configuration parameters.

***Note!*** Each acquisition system uses unique configuration parameters. Additionally, the sl-experiment library always assumes that any machine (PC) can only be used by a single data acquisition system (is permanently a part of that acquisition system). Only the **main** PC of the data acquisition system (e.g.: the VRPC of the Mesoscope-VR system), which runs the sl-experiment library, should be configured via this command.

For information about the available system configuration parameters, read the *API documentation* of the appropriate data-acquisition system, available from the [sl-shared-assets](https://github.com/Sun-Lab-NBB/sl-shared-assets) library. Specifically, all data acquisition system configuration parameters are defined in the *SystemConfiguration* class named after that system.

Additionally, since every data acquisition system requires access to the BioHPC server for long-term data storage, it needs to be provided with server access credentials. The credentials are stored inside the 'server_credentials.yaml' file, which is generated using the `sl-create-server-credentials` command. **Note!** The path to the generated and filled credentials file has to be provided to the acquisition system before runtime by editing the acquisition-system configuration file created via the command discussed above.

### Step 1: Creating a Project

All data acquisition sessions require a valid project to run. To create a new project, use the `sl-create-project` command. This command can only be called on the main PC of a properly configured data-acquisition system (see Step 0 above). As part of its runtime, this command generates the root project directory on all machines that make up the data acquisition system and the long-term storage destinations.

### Step 2: Creating an Experiment

All projects that involve scientific experiments also need to define at least one **experiment configuration**. Experiment configurations are unique for each data acquisition system and are stored inside .yaml files named after the experiment. To generate a new experiment configuration file, use the `sl-create-experiment` command. This command generates a **precursor** experiment configuration file inside the **configuration** subdirectory, stored under the root project directory on the main PC of the data acquisition system.

For information about the available experiment configuration parameters in the precursor file, read the *API documentation* of the appropriate data-acquisition system, available from the [sl-shared-assets](https://github.com/Sun-Lab-NBB/sl-shared-assets) library.

### Step 3: Maintaining the Acquisition System

All acquisition systems contain modules that require frequent maintenance. Most of these modules are unique to each acquisition system.
Therefore, this section is further broken into acquisition-system-specific subsections.

#### Mesoscope-VR

The Mesoscope-VR system contains two modules that require frequent maintenance: the **water delivery system** and the **running wheel**. To facilitate the maintenance of these modules, the sl-experiment library exposes the `sl-maintain` command.

This command is typically called at least twice during each day the system is used to acquire data. First, it is used at the beginning of the day to prepare the Mesoscope-VR system for runtime by filling the water delivery system. Second, it is used at the end of each day to empty the water delivery system.

Less frequently, this command is used to re-calibrate the water delivery system, typically as a result of replacing system components, such as the tubing or the valve itself. The command is also occasionally used to replace the surface material of the running wheel, which slowly deteriorates as the wheel is used.

This command can also facilitate cleaning the wheel, which is typically done before and after each runtime to remove any biological contaminants left by each animal participating in experiment or training runtimes.

***Note!*** Since this runtime fulfills multiple functions, it uses an 'input'-based terminal interface to accept further commands during runtime. To prevent visual bugs, the input does not print anything to the terminal and appears as a blank new line. If you see a blank new line with no terminal activity, this indicates that the system is ready to accept one of the supported commands. All supported commands are printed to the terminal as part of the runtime initialization.

Supported vr-maintenance commands:
1. `open`. Opens the water delivery valve.
2. `close`. Closes the water delivery valve.
3. `close_10`. Closes the water delivery valve after a 10-second delay.
4. `reference`. Triggers 200 valve pulses, with each pulse calibrated to deliver 5 uL of water. This command is used to check whether the valve calibration data stored in the system_configuration.yaml file is accurate. This should be done at the beginning of each day the system is used to acquire data. The reference runtime should dispense ~1 ml of water overall (200 pulses × 5 uL). If the reference procedure does not dispense the expected volume of water, the system needs to be recalibrated.
5. `calibrate_15`. Runs 200 valve pulses, keeping the valve open for 15 milliseconds at each pulse. This is used to generate valve calibration data.
6. `calibrate_30`. Same as above, but uses 30-millisecond pulses.
7. `calibrate_45`. Same as above, but uses 45-millisecond pulses.
8. `calibrate_60`. Same as above, but uses 60-millisecond pulses.
9. `lock`. Locks the running wheel (engages the running-wheel brake).
10. `unlock`. Unlocks the running wheel (disengages the running-wheel brake).

### Step 4: Acquiring Data

Each acquisition system supports one or more distinct types of data-acquisition sessions (runtimes). At a minimum, this includes an 'experiment' session type, which is the primary use case for all acquisition systems in the lab. Some systems may also support one or more training session types, which often do not acquire any brain activity data but otherwise behave similarly to experiment sessions.

#### Mesoscope-VR

The Mesoscope-VR system supports four types of runtime sessions, each associated with a unique runtime command exposed by the sl-experiment library:
1. **Window Checking**: This session is executed by calling the `sl-check-window` command. It guides the user through finding the imaging plane and generating the reference MotionEstimator.me and zstack.tiff files for the checked animal.
This session is typically run ~2–3 weeks after the surgical intervention and before any training or experiment sessions to assess the quality of the intervention and the suitability of including the animal in experiment cohorts.
2. **Lick Training**: This session is executed by calling the `sl-lick-train` command. All animals that participate in Mesoscope-VR experiments undergo a two-stage training protocol, with lick training being the first stage. During this runtime, the animals are head-fixed in the Mesoscope enclosure for ~20 minutes. The primary goal of the runtime is to teach the animals to lick at the water tube to consume water rewards and to associate the sound tone emitted at reward delivery with water coming out of the tube. During runtime, the running wheel is locked, so the animals cannot run.
3. **Run Training**: This session is executed by calling the `sl-run-train` command. This is the second stage of the mandatory two-stage Mesoscope-VR training protocol. During this runtime, the animals are head-fixed in the Mesoscope enclosure for ~40 minutes. The primary goal of the runtime is to teach the animals to run on the wheel while head-fixed and to associate running with receiving water rewards. During runtime, the running wheel is unlocked, but the Virtual Reality screens are kept off, ensuring that the animal is not exposed to any visual cues until the first experiment day.
4. **Mesoscope Experiment**: This session is executed by calling the `sl-experiment` command. This type of session is designed to execute the experiment specified in the target *experiment_configuration.yaml* file (see above). The system supports varying experiment configurations and Virtual Reality environments, offering experimenters the flexibility to run different projects using the same set of APIs and system hardware.

### Step 5: Preprocessing and Managing Data

All acquisition systems support two major ways of handling the session data acquired at runtime. For most runtimes, the choice of how to handle the data is made as part of the acquisition system shutdown sequence. However, in the case of unexpected runtime terminations, all data preprocessing steps can also be executed manually by calling the appropriate CLI command.

The first and most commonly used way is to **preprocess** the acquired data. This can be done manually by calling the `sl-preprocess` command. Preprocessing consists of two major steps. The first step pulls all available data to the main data acquisition system machine (PC) and re-packages (re-compresses) the data to reduce its size without loss. The second step distributes (pushes) the data to **multiple** long-term storage destinations, such as the NAS and the BioHPC server.

**Critical!** It is imperative that **all** valid data acquired in the lab undergoes preprocessing **as soon as possible**. Only preprocessed data is stored in a way that maximizes its safety by using both redundancy and parity. Data that has not been preprocessed may be **lost** in an emergency, which is considerably less likely for preprocessed data.

When preprocessed data has been successfully and safely pushed to the long-term storage destinations, the preprocessing runtime marks all raw data directories for that session stored on any acquisition system machine for deletion by creating *ubiquitin.bin* marker files. It then calls a function that finds and removes marked directories for **all available projects and animals**. This final step can be triggered manually by calling the `sl-purge` CLI command.
To optimize\ndisk space usage on acquisition system machines, it is recommended to call this command at least daily.\n\nThe second way of managing the data is primarily used during testing and when handling interrupted sessions that did \nnot generate any valid data. This involves removing all session data from **both** the data acquisition system and all \nlong-term storage destinations. This runtime is extremely dangerous and, if not used carefully, can \n***permanently delete valid data***. This processing mode can be triggered using the `sl-delete-session-data` CLI \ncommand, although it is not recommended for most users.\n\n---\n\n## API Documentation\n\nSee the [API documentation](https://sl-experiment-api-docs.netlify.app/) for the\ndetailed description of the methods and classes exposed by components of this library, as well as all available \nCLI commands with their arguments\n\n___\n\n## Recovering from Interruptions\nWhile it is not typical for the data acquisition or preprocessing pipelines to fail during runtime, it is possible. The \nlibrary can recover or gracefully terminate the runtime for most code-generated errors, so this is usually not a\nconcern. However, if a major interruption (i.e., power outage) occurs or one of the hardware assets malfunctions during\nruntime, manual intervention is typically required to recover the session data and reset the acquisition system.\n\n### Data acquisition interruption\n\nData acquisition can be interrupted in two main ways, the first being due to an external asset failure, for example, if \nthe ScanImagePC unexpectedly shuts down during Mesoscope-VR system runtime. In this case, the runtime pauses and \ninstructs the user to troubleshoot the issue and then resume the runtime. This type of *soft* interruption is handled \ngracefully during runtime, data processing, and analysis to exclude the data collected during the interruption from the \noutput dataset. 
Generally, soft interruptions are supported for most external assets, which includes anything not \nmanaged directly by the sl-experiment library and the main data acquisition system PC. While inconvenient, these \ninterruptions do not typically require specialized handling other than recovering and restoring the failed asset.\n\n**Note!** While most soft interruptions typically entail resuming the interrupted runtime, it is also possible to\ninstead terminate the runtime. To do so, execute the `termiante` command via the GUI instead of trying to resume \nthe runtime. In this case, the system attempts to execute a graceful shutdown procedure, saving all valid data in the \nprocess.\n\nThe second way involves interruption due to sl-experiment runtime failures or unexpected shut-downs of the main \nacquisition system PC. In these cases, manual user intervention is typically required to recover the useful data and \nreset the system before the acquisition can be restarted. The handling of such cases often consists of specific steps\nor each supported acquisition system. Typically, these *hard* interruptions are related to major issues, such as \nglobal facility power loss or severe malfunction of sensitive acquisition system components, such as microcontrollers \nand communication cables.\n\n#### Mesoscope-VR\nIf the VRPC runtime unexpectedly interrupts at any point without executing the graceful shutdown, follow these \ninstructions:\n1. If the session involved Mesoscope imaging, shut down the Mesoscope acquisition process and make sure all required \n files (frame stacks, motion estimator data, cranial window screenshot) have been generated and saved to the \n **mesoscope_data** folder.\n2. If necessary, **manually** edit the session_descriptor.yaml, the mesoscope_positions.yaml, and the \n zaber_positions.yaml files to include actual runtime information. Estimate the volume of water delivered at runtime \n by manually reading the water tank level gauge. \n3. 
Remove the animal from the Mesoscope enclosure. If necessary, use the *Zaber Launcher* app to directly interface with\n Zaber motors and move them in a way that allows the animal to be recovered from the enclosure.\n4. Use Zaber Launcher to **manually move the HeadBarRoll axis to have a positive angle** (> 0 degrees). This is \n critical! If this is not done, the motor will not be able to home during the next session and will instead collide \n with the movement guard, at best damaging the motor and, at worst, the Mesoscope.\n5. Go into the 'Device Settings' tab of the Zaber Launcher, click on each Device tab (NOT motor!) and navigate to its \n User Data section. Then **flip Setting 1 from 0 to 1**. Without this, the library will refuse to operate the Zaber \n Motors during all future runtimes.\n6. If the session involved Mesoscope imaging, **rename the mesoscope_data folder to prepend the session name, using an\n underscore to separate the folder name from the session name**. For example, from mesoscope_data \u2192 \n 2025-11-11-05-03-234123_mesoscope_data. **Critical!** If this is not done, the library may **delete** any leftover \n Mesoscope files during the next runtime and will not be able to properly preprocess the frames for the interrupted\n session during the next step.\n7. Call the `sl-process` command and provide it with the path to the session directory of the interrupted session. This\n will preprocess and transfer all collected data to the long-term storage destinations. This way, you can preserve \n any data acquired before the interruption and prepare the system for running the next session.\n\n### Data preprocessing interruption\nTo recover from an error encountered during preprocessing, call the `sl-process` command and provide it with the path \nto the session directory of the interrupted session. 
The preprocessing pipeline should automatically resume an \ninterrupted runtime from the nearest checkpoint.\n\n---\n\n## Versioning\n\nThis project uses [semantic versioning](https://semver.org/). For the versions available, see the \n[tags on this repository](https://github.com/Sun-Lab-NBB/sl-experiment/tags).\n\n---\n\n## Authors\n\n- Ivan Kondratyev ([Inkaros](https://github.com/Inkaros))\n- Kushaan Gupta ([kushaangupta](https://github.com/kushaangupta))\n- Natalie Yeung\n- Katlynn Ryu ([katlynn-ryu](https://github.com/KatlynnRyu))\n- Jasmine Si\n\n___\n\n## License\n\nThis project is licensed under the GPL-3.0 license: see the [LICENSE](LICENSE) file for details.\n\n___\n\n## Acknowledgments\n\n- All Sun lab [members](https://neuroai.github.io/sunlab/people) for providing the inspiration and comments during the\n development of this library.\n- The creators of all other projects used in our development automation pipelines and source code \n (see [pyproject.toml](pyproject.toml)).\n\n---",
"bugtrack_url": null,
"license": "GNU GENERAL PUBLIC LICENSE\n Version 3, 29 June 2007\n \n Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n \n Preamble\n \n The GNU General Public License is a free, copyleft license for\n software and other kinds of works.\n \n The licenses for most software and other practical works are designed\n to take away your freedom to share and change the works. By contrast,\n the GNU General Public License is intended to guarantee your freedom to\n share and change all versions of a program--to make sure it remains free\n software for all its users. We, the Free Software Foundation, use the\n GNU General Public License for most of our software; it applies also to\n any other work released this way by its authors. You can apply it to\n your programs, too.\n \n When we speak of free software, we are referring to freedom, not\n price. Our General Public Licenses are designed to make sure that you\n have the freedom to distribute copies of free software (and charge for\n them if you wish), that you receive source code or can get it if you\n want it, that you can change the software or use pieces of it in new\n free programs, and that you know you can do these things.\n \n To protect your rights, we need to prevent others from denying you\n these rights or asking you to surrender the rights. Therefore, you have\n certain responsibilities if you distribute copies of the software, or if\n you modify it: responsibilities to respect the freedom of others.\n \n For example, if you distribute copies of such a program, whether\n gratis or for a fee, you must pass on to the recipients the same\n freedoms that you received. You must make sure that they, too, receive\n or can get the source code. 
And you must show them these terms so they\n know their rights.\n \n Developers that use the GNU GPL protect your rights with two steps:\n (1) assert copyright on the software, and (2) offer you this License\n giving you legal permission to copy, distribute and/or modify it.\n \n For the developers' and authors' protection, the GPL clearly explains\n that there is no warranty for this free software. For both users' and\n authors' sake, the GPL requires that modified versions be marked as\n changed, so that their problems will not be attributed erroneously to\n authors of previous versions.\n \n Some devices are designed to deny users access to install or run\n modified versions of the software inside them, although the manufacturer\n can do so. This is fundamentally incompatible with the aim of\n protecting users' freedom to change the software. The systematic\n pattern of such abuse occurs in the area of products for individuals to\n use, which is precisely where it is most unacceptable. Therefore, we\n have designed this version of the GPL to prohibit the practice for those\n products. If such problems arise substantially in other domains, we\n stand ready to extend this provision to those domains in future versions\n of the GPL, as needed to protect the freedom of users.\n \n Finally, every program is threatened constantly by software patents.\n States should not allow patents to restrict development and use of\n software on general-purpose computers, but in those that do, we wish to\n avoid the special danger that patents applied to a free program could\n make it effectively proprietary. To prevent this, the GPL assures that\n patents cannot be used to render the program non-free.\n \n The precise terms and conditions for copying, distribution and\n modification follow.\n \n TERMS AND CONDITIONS\n \n 0. 
Definitions.\n \n \"This License\" refers to version 3 of the GNU General Public License.\n \n \"Copyright\" also means copyright-like laws that apply to other kinds of\n works, such as semiconductor masks.\n \n \"The Program\" refers to any copyrightable work licensed under this\n License. Each licensee is addressed as \"you\". \"Licensees\" and\n \"recipients\" may be individuals or organizations.\n \n To \"modify\" a work means to copy from or adapt all or part of the work\n in a fashion requiring copyright permission, other than the making of an\n exact copy. The resulting work is called a \"modified version\" of the\n earlier work or a work \"based on\" the earlier work.\n \n A \"covered work\" means either the unmodified Program or a work based\n on the Program.\n \n To \"propagate\" a work means to do anything with it that, without\n permission, would make you directly or secondarily liable for\n infringement under applicable copyright law, except executing it on a\n computer or modifying a private copy. Propagation includes copying,\n distribution (with or without modification), making available to the\n public, and in some countries other activities as well.\n \n To \"convey\" a work means any kind of propagation that enables other\n parties to make or receive copies. Mere interaction with a user through\n a computer network, with no transfer of a copy, is not conveying.\n \n An interactive user interface displays \"Appropriate Legal Notices\"\n to the extent that it includes a convenient and prominently visible\n feature that (1) displays an appropriate copyright notice, and (2)\n tells the user that there is no warranty for the work (except to the\n extent that warranties are provided), that licensees may convey the\n work under this License, and how to view a copy of this License. If\n the interface presents a list of user commands or options, such as a\n menu, a prominent item in the list meets this criterion.\n \n 1. 
Source Code.\n \n The \"source code\" for a work means the preferred form of the work\n for making modifications to it. \"Object code\" means any non-source\n form of a work.\n \n A \"Standard Interface\" means an interface that either is an official\n standard defined by a recognized standards body, or, in the case of\n interfaces specified for a particular programming language, one that\n is widely used among developers working in that language.\n \n The \"System Libraries\" of an executable work include anything, other\n than the work as a whole, that (a) is included in the normal form of\n packaging a Major Component, but which is not part of that Major\n Component, and (b) serves only to enable use of the work with that\n Major Component, or to implement a Standard Interface for which an\n implementation is available to the public in source code form. A\n \"Major Component\", in this context, means a major essential component\n (kernel, window system, and so on) of the specific operating system\n (if any) on which the executable work runs, or a compiler used to\n produce the work, or an object code interpreter used to run it.\n \n The \"Corresponding Source\" for a work in object code form means all\n the source code needed to generate, install, and (for an executable\n work) run the object code and to modify the work, including scripts to\n control those activities. However, it does not include the work's\n System Libraries, or general-purpose tools or generally available free\n programs which are used unmodified in performing those activities but\n which are not part of the work. 
For example, Corresponding Source\n includes interface definition files associated with source files for\n the work, and the source code for shared libraries and dynamically\n linked subprograms that the work is specifically designed to require,\n such as by intimate data communication or control flow between those\n subprograms and other parts of the work.\n \n The Corresponding Source need not include anything that users\n can regenerate automatically from other parts of the Corresponding\n Source.\n \n The Corresponding Source for a work in source code form is that\n same work.\n \n 2. Basic Permissions.\n \n All rights granted under this License are granted for the term of\n copyright on the Program, and are irrevocable provided the stated\n conditions are met. This License explicitly affirms your unlimited\n permission to run the unmodified Program. The output from running a\n covered work is covered by this License only if the output, given its\n content, constitutes a covered work. This License acknowledges your\n rights of fair use or other equivalent, as provided by copyright law.\n \n You may make, run and propagate covered works that you do not\n convey, without conditions so long as your license otherwise remains\n in force. You may convey covered works to others for the sole purpose\n of having them make modifications exclusively for you, or provide you\n with facilities for running those works, provided that you comply with\n the terms of this License in conveying all material for which you do\n not control copyright. Those thus making or running the covered works\n for you must do so exclusively on your behalf, under your direction\n and control, on terms that prohibit them from making any copies of\n your copyrighted material outside their relationship with you.\n \n Conveying under any other circumstances is permitted solely under\n the conditions stated below. Sublicensing is not allowed; section 10\n makes it unnecessary.\n \n 3. 
Protecting Users' Legal Rights From Anti-Circumvention Law.\n \n No covered work shall be deemed part of an effective technological\n measure under any applicable law fulfilling obligations under article\n 11 of the WIPO copyright treaty adopted on 20 December 1996, or\n similar laws prohibiting or restricting circumvention of such\n measures.\n \n When you convey a covered work, you waive any legal power to forbid\n circumvention of technological measures to the extent such circumvention\n is effected by exercising rights under this License with respect to\n the covered work, and you disclaim any intention to limit operation or\n modification of the work as a means of enforcing, against the work's\n users, your or third parties' legal rights to forbid circumvention of\n technological measures.\n \n 4. Conveying Verbatim Copies.\n \n You may convey verbatim copies of the Program's source code as you\n receive it, in any medium, provided that you conspicuously and\n appropriately publish on each copy an appropriate copyright notice;\n keep intact all notices stating that this License and any\n non-permissive terms added in accord with section 7 apply to the code;\n keep intact all notices of the absence of any warranty; and give all\n recipients a copy of this License along with the Program.\n \n You may charge any price or no price for each copy that you convey,\n and you may offer support or warranty protection for a fee.\n \n 5. Conveying Modified Source Versions.\n \n You may convey a work based on the Program, or the modifications to\n produce it from the Program, in the form of source code under the\n terms of section 4, provided that you also meet all of these conditions:\n \n a) The work must carry prominent notices stating that you modified\n it, and giving a relevant date.\n \n b) The work must carry prominent notices stating that it is\n released under this License and any conditions added under section\n 7. 
This requirement modifies the requirement in section 4 to\n \"keep intact all notices\".\n \n c) You must license the entire work, as a whole, under this\n License to anyone who comes into possession of a copy. This\n License will therefore apply, along with any applicable section 7\n additional terms, to the whole of the work, and all its parts,\n regardless of how they are packaged. This License gives no\n permission to license the work in any other way, but it does not\n invalidate such permission if you have separately received it.\n \n d) If the work has interactive user interfaces, each must display\n Appropriate Legal Notices; however, if the Program has interactive\n interfaces that do not display Appropriate Legal Notices, your\n work need not make them do so.\n \n A compilation of a covered work with other separate and independent\n works, which are not by their nature extensions of the covered work,\n and which are not combined with it such as to form a larger program,\n in or on a volume of a storage or distribution medium, is called an\n \"aggregate\" if the compilation and its resulting copyright are not\n used to limit the access or legal rights of the compilation's users\n beyond what the individual works permit. Inclusion of a covered work\n in an aggregate does not cause this License to apply to the other\n parts of the aggregate.\n \n 6. 
Conveying Non-Source Forms.\n \n You may convey a covered work in object code form under the terms\n of sections 4 and 5, provided that you also convey the\n machine-readable Corresponding Source under the terms of this License,\n in one of these ways:\n \n a) Convey the object code in, or embodied in, a physical product\n (including a physical distribution medium), accompanied by the\n Corresponding Source fixed on a durable physical medium\n customarily used for software interchange.\n \n b) Convey the object code in, or embodied in, a physical product\n (including a physical distribution medium), accompanied by a\n written offer, valid for at least three years and valid for as\n long as you offer spare parts or customer support for that product\n model, to give anyone who possesses the object code either (1) a\n copy of the Corresponding Source for all the software in the\n product that is covered by this License, on a durable physical\n medium customarily used for software interchange, for a price no\n more than your reasonable cost of physically performing this\n conveying of source, or (2) access to copy the\n Corresponding Source from a network server at no charge.\n \n c) Convey individual copies of the object code with a copy of the\n written offer to provide the Corresponding Source. This\n alternative is allowed only occasionally and noncommercially, and\n only if you received the object code with such an offer, in accord\n with subsection 6b.\n \n d) Convey the object code by offering access from a designated\n place (gratis or for a charge), and offer equivalent access to the\n Corresponding Source in the same way through the same place at no\n further charge. You need not require recipients to copy the\n Corresponding Source along with the object code. 
If the place to\n copy the object code is a network server, the Corresponding Source\n may be on a different server (operated by you or a third party)\n that supports equivalent copying facilities, provided you maintain\n clear directions next to the object code saying where to find the\n Corresponding Source. Regardless of what server hosts the\n Corresponding Source, you remain obligated to ensure that it is\n available for as long as needed to satisfy these requirements.\n \n e) Convey the object code using peer-to-peer transmission, provided\n you inform other peers where the object code and Corresponding\n Source of the work are being offered to the general public at no\n charge under subsection 6d.\n \n A separable portion of the object code, whose source code is excluded\n from the Corresponding Source as a System Library, need not be\n included in conveying the object code work.\n \n A \"User Product\" is either (1) a \"consumer product\", which means any\n tangible personal property which is normally used for personal, family,\n or household purposes, or (2) anything designed or sold for incorporation\n into a dwelling. In determining whether a product is a consumer product,\n doubtful cases shall be resolved in favor of coverage. For a particular\n product received by a particular user, \"normally used\" refers to a\n typical or common use of that class of product, regardless of the status\n of the particular user or of the way in which the particular user\n actually uses, or expects or is expected to use, the product. 
A product\n is a consumer product regardless of whether the product has substantial\n commercial, industrial or non-consumer uses, unless such uses represent\n the only significant mode of use of the product.\n \n \"Installation Information\" for a User Product means any methods,\n procedures, authorization keys, or other information required to install\n and execute modified versions of a covered work in that User Product from\n a modified version of its Corresponding Source. The information must\n suffice to ensure that the continued functioning of the modified object\n code is in no case prevented or interfered with solely because\n modification has been made.\n \n If you convey an object code work under this section in, or with, or\n specifically for use in, a User Product, and the conveying occurs as\n part of a transaction in which the right of possession and use of the\n User Product is transferred to the recipient in perpetuity or for a\n fixed term (regardless of how the transaction is characterized), the\n Corresponding Source conveyed under this section must be accompanied\n by the Installation Information. But this requirement does not apply\n if neither you nor any third party retains the ability to install\n modified object code on the User Product (for example, the work has\n been installed in ROM).\n \n The requirement to provide Installation Information does not include a\n requirement to continue to provide support service, warranty, or updates\n for a work that has been modified or installed by the recipient, or for\n the User Product in which it has been modified or installed. 
Access to a\n network may be denied when the modification itself materially and\n adversely affects the operation of the network or violates the rules and\n protocols for communication across the network.\n \n Corresponding Source conveyed, and Installation Information provided,\n in accord with this section must be in a format that is publicly\n documented (and with an implementation available to the public in\n source code form), and must require no special password or key for\n unpacking, reading or copying.\n \n 7. Additional Terms.\n \n \"Additional permissions\" are terms that supplement the terms of this\n License by making exceptions from one or more of its conditions.\n Additional permissions that are applicable to the entire Program shall\n be treated as though they were included in this License, to the extent\n that they are valid under applicable law. If additional permissions\n apply only to part of the Program, that part may be used separately\n under those permissions, but the entire Program remains governed by\n this License without regard to the additional permissions.\n \n When you convey a copy of a covered work, you may at your option\n remove any additional permissions from that copy, or from any part of\n it. (Additional permissions may be written to require their own\n removal in certain cases when you modify the work.) 
You may place\n additional permissions on material, added by you to a covered work,\n for which you have or can give appropriate copyright permission.\n \n Notwithstanding any other provision of this License, for material you\n add to a covered work, you may (if authorized by the copyright holders of\n that material) supplement the terms of this License with terms:\n \n a) Disclaiming warranty or limiting liability differently from the\n terms of sections 15 and 16 of this License; or\n \n b) Requiring preservation of specified reasonable legal notices or\n author attributions in that material or in the Appropriate Legal\n Notices displayed by works containing it; or\n \n c) Prohibiting misrepresentation of the origin of that material, or\n requiring that modified versions of such material be marked in\n reasonable ways as different from the original version; or\n \n d) Limiting the use for publicity purposes of names of licensors or\n authors of the material; or\n \n e) Declining to grant rights under trademark law for use of some\n trade names, trademarks, or service marks; or\n \n f) Requiring indemnification of licensors and authors of that\n material by anyone who conveys the material (or modified versions of\n it) with contractual assumptions of liability to the recipient, for\n any liability that these contractual assumptions directly impose on\n those licensors and authors.\n \n All other non-permissive additional terms are considered \"further\n restrictions\" within the meaning of section 10. If the Program as you\n received it, or any part of it, contains a notice stating that it is\n governed by this License along with a term that is a further\n restriction, you may remove that term. 
If a license document contains\n a further restriction but permits relicensing or conveying under this\n License, you may add to a covered work material governed by the terms\n of that license document, provided that the further restriction does\n not survive such relicensing or conveying.\n \n If you add terms to a covered work in accord with this section, you\n must place, in the relevant source files, a statement of the\n additional terms that apply to those files, or a notice indicating\n where to find the applicable terms.\n \n Additional terms, permissive or non-permissive, may be stated in the\n form of a separately written license, or stated as exceptions;\n the above requirements apply either way.\n \n 8. Termination.\n \n You may not propagate or modify a covered work except as expressly\n provided under this License. Any attempt otherwise to propagate or\n modify it is void, and will automatically terminate your rights under\n this License (including any patent licenses granted under the third\n paragraph of section 11).\n \n However, if you cease all violation of this License, then your\n license from a particular copyright holder is reinstated (a)\n provisionally, unless and until the copyright holder explicitly and\n finally terminates your license, and (b) permanently, if the copyright\n holder fails to notify you of the violation by some reasonable means\n prior to 60 days after the cessation.\n \n Moreover, your license from a particular copyright holder is\n reinstated permanently if the copyright holder notifies you of the\n violation by some reasonable means, this is the first time you have\n received notice of violation of this License (for any work) from that\n copyright holder, and you cure the violation prior to 30 days after\n your receipt of the notice.\n \n Termination of your rights under this section does not terminate the\n licenses of parties who have received copies or rights from you under\n this License. 
If your rights have been terminated and not permanently\n reinstated, you do not qualify to receive new licenses for the same\n material under section 10.\n \n 9. Acceptance Not Required for Having Copies.\n \n You are not required to accept this License in order to receive or\n run a copy of the Program. Ancillary propagation of a covered work\n occurring solely as a consequence of using peer-to-peer transmission\n to receive a copy likewise does not require acceptance. However,\n nothing other than this License grants you permission to propagate or\n modify any covered work. These actions infringe copyright if you do\n not accept this License. Therefore, by modifying or propagating a\n covered work, you indicate your acceptance of this License to do so.\n \n 10. Automatic Licensing of Downstream Recipients.\n \n Each time you convey a covered work, the recipient automatically\n receives a license from the original licensors, to run, modify and\n propagate that work, subject to this License. You are not responsible\n for enforcing compliance by third parties with this License.\n \n An \"entity transaction\" is a transaction transferring control of an\n organization, or substantially all assets of one, or subdividing an\n organization, or merging organizations. If propagation of a covered\n work results from an entity transaction, each party to that\n transaction who receives a copy of the work also receives whatever\n licenses to the work the party's predecessor in interest had or could\n give under the previous paragraph, plus a right to possession of the\n Corresponding Source of the work from the predecessor in interest, if\n the predecessor has it or can get it with reasonable efforts.\n \n You may not impose any further restrictions on the exercise of the\n rights granted or affirmed under this License. 
For example, you may\n not impose a license fee, royalty, or other charge for exercise of\n rights granted under this License, and you may not initiate litigation\n (including a cross-claim or counterclaim in a lawsuit) alleging that\n any patent claim is infringed by making, using, selling, offering for\n sale, or importing the Program or any portion of it.\n \n 11. Patents.\n \n A \"contributor\" is a copyright holder who authorizes use under this\n License of the Program or a work on which the Program is based. The\n work thus licensed is called the contributor's \"contributor version\".\n \n A contributor's \"essential patent claims\" are all patent claims\n owned or controlled by the contributor, whether already acquired or\n hereafter acquired, that would be infringed by some manner, permitted\n by this License, of making, using, or selling its contributor version,\n but do not include claims that would be infringed only as a\n consequence of further modification of the contributor version. For\n purposes of this definition, \"control\" includes the right to grant\n patent sublicenses in a manner consistent with the requirements of\n this License.\n \n Each contributor grants you a non-exclusive, worldwide, royalty-free\n patent license under the contributor's essential patent claims, to\n make, use, sell, offer for sale, import and otherwise run, modify and\n propagate the contents of its contributor version.\n \n In the following three paragraphs, a \"patent license\" is any express\n agreement or commitment, however denominated, not to enforce a patent\n (such as an express permission to practice a patent or covenant not to\n sue for patent infringement). 
To \"grant\" such a patent license to a\n party means to make such an agreement or commitment not to enforce a\n patent against the party.\n \n If you convey a covered work, knowingly relying on a patent license,\n and the Corresponding Source of the work is not available for anyone\n to copy, free of charge and under the terms of this License, through a\n publicly available network server or other readily accessible means,\n then you must either (1) cause the Corresponding Source to be so\n available, or (2) arrange to deprive yourself of the benefit of the\n patent license for this particular work, or (3) arrange, in a manner\n consistent with the requirements of this License, to extend the patent\n license to downstream recipients. \"Knowingly relying\" means you have\n actual knowledge that, but for the patent license, your conveying the\n covered work in a country, or your recipient's use of the covered work\n in a country, would infringe one or more identifiable patents in that\n country that you have reason to believe are valid.\n \n If, pursuant to or in connection with a single transaction or\n arrangement, you convey, or propagate by procuring conveyance of, a\n covered work, and grant a patent license to some of the parties\n receiving the covered work authorizing them to use, propagate, modify\n or convey a specific copy of the covered work, then the patent license\n you grant is automatically extended to all recipients of the covered\n work and works based on it.\n \n A patent license is \"discriminatory\" if it does not include within\n the scope of its coverage, prohibits the exercise of, or is\n conditioned on the non-exercise of one or more of the rights that are\n specifically granted under this License. 
You may not convey a covered\n work if you are a party to an arrangement with a third party that is\n in the business of distributing software, under which you make payment\n to the third party based on the extent of your activity of conveying\n the work, and under which the third party grants, to any of the\n parties who would receive the covered work from you, a discriminatory\n patent license (a) in connection with copies of the covered work\n conveyed by you (or copies made from those copies), or (b) primarily\n for and in connection with specific products or compilations that\n contain the covered work, unless you entered into that arrangement,\n or that patent license was granted, prior to 28 March 2007.\n \n Nothing in this License shall be construed as excluding or limiting\n any implied license or other defenses to infringement that may\n otherwise be available to you under applicable patent law.\n \n 12. No Surrender of Others' Freedom.\n \n If conditions are imposed on you (whether by court order, agreement or\n otherwise) that contradict the conditions of this License, they do not\n excuse you from the conditions of this License. If you cannot convey a\n covered work so as to satisfy simultaneously your obligations under this\n License and any other pertinent obligations, then as a consequence you may\n not convey it at all. For example, if you agree to terms that obligate you\n to collect a royalty for further conveying from those to whom you convey\n the Program, the only way you could satisfy both those terms and this\n License would be to refrain entirely from conveying the Program.\n \n 13. Use with the GNU Affero General Public License.\n \n Notwithstanding any other provision of this License, you have\n permission to link or combine any covered work with a work licensed\n under version 3 of the GNU Affero General Public License into a single\n combined work, and to convey the resulting work. 
The terms of this\n License will continue to apply to the part which is the covered work,\n but the special requirements of the GNU Affero General Public License,\n section 13, concerning interaction through a network will apply to the\n combination as such.\n \n 14. Revised Versions of this License.\n \n The Free Software Foundation may publish revised and/or new versions of\n the GNU General Public License from time to time. Such new versions will\n be similar in spirit to the present version, but may differ in detail to\n address new problems or concerns.\n \n Each version is given a distinguishing version number. If the\n Program specifies that a certain numbered version of the GNU General\n Public License \"or any later version\" applies to it, you have the\n option of following the terms and conditions either of that numbered\n version or of any later version published by the Free Software\n Foundation. If the Program does not specify a version number of the\n GNU General Public License, you may choose any version ever published\n by the Free Software Foundation.\n \n If the Program specifies that a proxy can decide which future\n versions of the GNU General Public License can be used, that proxy's\n public statement of acceptance of a version permanently authorizes you\n to choose that version for the Program.\n \n Later license versions may give you additional or different\n permissions. However, no additional obligations are imposed on any\n author or copyright holder as a result of your choosing to follow a\n later version.\n \n 15. Disclaimer of Warranty.\n \n THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY\n APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT\n HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM \"AS IS\" WITHOUT WARRANTY\n OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,\n THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\n PURPOSE. 
THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM\n IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF\n ALL NECESSARY SERVICING, REPAIR OR CORRECTION.\n \n 16. Limitation of Liability.\n \n IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING\n WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS\n THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY\n GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE\n USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF\n DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD\n PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),\n EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF\n SUCH DAMAGES.\n \n 17. Interpretation of Sections 15 and 16.\n \n If the disclaimer of warranty and limitation of liability provided\n above cannot be given local legal effect according to their terms,\n reviewing courts shall apply local law that most closely approximates\n an absolute waiver of all civil liability in connection with the\n Program, unless a warranty or assumption of liability accompanies a\n copy of the Program in return for a fee.\n \n END OF TERMS AND CONDITIONS\n \n How to Apply These Terms to Your New Programs\n \n If you develop a new program, and you want it to be of the greatest\n possible use to the public, the best way to achieve this is to make it\n free software which everyone can redistribute and change under these terms.\n \n To do so, attach the following notices to the program. 
It is safest\n to attach them to the start of each source file to most effectively\n state the exclusion of warranty; and each file should have at least\n the \"copyright\" line and a pointer to where the full notice is found.\n \n <one line to give the program's name and a brief idea of what it does.>\n Copyright (C) <year> <name of author>\n \n This program is free software: you can redistribute it and/or modify\n it under the terms of the GNU General Public License as published by\n the Free Software Foundation, either version 3 of the License, or\n (at your option) any later version.\n \n This program is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n GNU General Public License for more details.\n \n You should have received a copy of the GNU General Public License\n along with this program. If not, see <https://www.gnu.org/licenses/>.\n \n Also add information on how to contact you by electronic and paper mail.\n \n If the program does terminal interaction, make it output a short\n notice like this when it starts in an interactive mode:\n \n <program> Copyright (C) <year> <name of author>\n This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.\n This is free software, and you are welcome to redistribute it\n under certain conditions; type `show c' for details.\n \n The hypothetical commands `show w' and `show c' should show the appropriate\n parts of the General Public License. 
Of course, your program's commands\n might be different; for a GUI interface, you would use an \"about box\".\n \n You should also get your employer (if you work as a programmer) or school,\n if any, to sign a \"copyright disclaimer\" for the program, if necessary.\n For more information on this, and how to apply and follow the GNU GPL, see\n <https://www.gnu.org/licenses/>.\n \n The GNU General Public License does not permit incorporating your program\n into proprietary programs. If your program is a subroutine library, you\n may consider it more useful to permit linking proprietary applications with\n the library. If this is what you want to do, use the GNU Lesser General\n Public License instead of this License. But first, please read\n <https://www.gnu.org/licenses/why-not-lgpl.html>.",
"summary": "Provides tools to acquire, manage, and preprocess scientific data in the Sun (NeuroAI) lab.",
"version": "3.0.0",
"project_urls": {
"Documentation": "https://sl-experiment-api-docs.netlify.app/",
"Homepage": "https://github.com/Sun-Lab-NBB/sl-experiment"
},
"split_keywords": [
"acquisition",
" ataraxis",
" data",
" experiment",
" interface",
" mesoscope",
" sunlab"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "7a107683c44ab0c4af11aecd2c5d5688d565ee40a9197f2739ad2519d71cde7e",
"md5": "783acf5c3dd863b64b2763b49b4db4da",
"sha256": "4a8f195d2f658295734b3999ac179ec9fe6169064775290d826ed46cbbe52ab9"
},
"downloads": -1,
"filename": "sl_experiment-3.0.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "783acf5c3dd863b64b2763b49b4db4da",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<3.12,>=3.11",
"size": 269421,
"upload_time": "2025-07-20T19:28:26",
"upload_time_iso_8601": "2025-07-20T19:28:26.028995Z",
"url": "https://files.pythonhosted.org/packages/7a/10/7683c44ab0c4af11aecd2c5d5688d565ee40a9197f2739ad2519d71cde7e/sl_experiment-3.0.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "f109c267976daa41e0034f66e62da6174d7be5c65c6237a6e3b6b762d1df0bfe",
"md5": "31ac51120f1a4a526c196bb4b8a76e14",
"sha256": "ea367c621df4ce21408e690cc0de7f3b4451d7e69c4c235028e612d10c783dec"
},
"downloads": -1,
"filename": "sl_experiment-3.0.0.tar.gz",
"has_sig": false,
"md5_digest": "31ac51120f1a4a526c196bb4b8a76e14",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<3.12,>=3.11",
"size": 276759,
"upload_time": "2025-07-20T19:28:27",
"upload_time_iso_8601": "2025-07-20T19:28:27.287557Z",
"url": "https://files.pythonhosted.org/packages/f1/09/c267976daa41e0034f66e62da6174d7be5c65c6237a6e3b6b762d1df0bfe/sl_experiment-3.0.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-20 19:28:27",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "Sun-Lab-NBB",
"github_project": "sl-experiment",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"tox": true,
"lcname": "sl-experiment"
}
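The `digests` entries above publish the SHA-256 checksums for each release artifact, which lets you confirm that a downloaded wheel or sdist matches what was uploaded to PyPI. A minimal sketch of that check, using only the standard library — the `EXPECTED_SHA256` constant is copied from the wheel's `digests` block above, and the helper name `sha256_of` is my own, not part of `sl-experiment`:

```python
import hashlib

# SHA-256 digest published for sl_experiment-3.0.0-py3-none-any.whl
# (copied from the "digests" block in the metadata above).
EXPECTED_SHA256 = "4a8f195d2f658295734b3999ac179ec9fe6169064775290d826ed46cbbe52ab9"


def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, streaming in 64 KiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Usage, assuming the wheel was downloaded to the current directory:
# if sha256_of("sl_experiment-3.0.0-py3-none-any.whl") != EXPECTED_SHA256:
#     raise ValueError("checksum mismatch - do not install this artifact")
```

The same digests can also be enforced at install time via pip's hash-checking mode, e.g. a requirements line of the form `sl-experiment==3.0.0 --hash=sha256:4a8f...2ab9`, which makes pip refuse any artifact whose checksum does not match.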