collective.exportimport

:Name: collective.exportimport
:Version: 1.12
:Home page: https://github.com/collective/collective.exportimport
:Summary: An add-on for Plone to export and import content, members, relations, translations and localroles.
:Upload time: 2024-03-08 10:37:50
:Author: Philip Bauer (for starzel.de)
:Requires Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*
:License: GPL version 2
:Keywords: python, plone, cms
:Requirements: zc.buildout, setuptools
.. This README is meant for consumption by humans and PyPI. PyPI can render reStructuredText files, so please do not use Sphinx features.
   If you want to learn more about writing documentation, please check out: https://6.docs.plone.org/contributing/documentation/
   This text does not appear on PyPI or GitHub. It is a comment.

.. image:: https://img.shields.io/pypi/v/collective.exportimport.svg
    :target: https://pypi.org/project/collective.exportimport/
    :alt: Latest Version

.. image:: https://img.shields.io/pypi/status/collective.exportimport.svg
    :target: https://pypi.org/project/collective.exportimport/
    :alt: Egg Status

.. image:: https://img.shields.io/pypi/pyversions/collective.exportimport.svg?style=plastic
    :alt: Supported - Python Versions

.. image:: https://img.shields.io/pypi/l/collective.exportimport.svg
    :target: https://pypi.org/project/collective.exportimport/
    :alt: License


=======================
collective.exportimport
=======================

Export and import content, members, relations, translations, localroles and much more.

Export and import all kinds of data from and to Plone sites using an intermediate JSON format.
The main use case is migrations, since it enables you to, for example, migrate from Plone 4 with Archetypes and Python 2 to Plone 6 with Dexterity and Python 3 in one step.
Most features use `plone.restapi` to serialize and deserialize data.

See also the training on migrating with ``exportimport``: https://training.plone.org/migrations/exportimport.html

.. contents:: Contents
    :local:

Features
========

* Export & Import content
* Export & Import members and groups with their roles
* Export & Import relations
* Export & Import translations
* Export & Import local roles
* Export & Import order (position in parent)
* Export & Import discussions/comments
* Export & Import versioned content
* Export & Import redirects

Export supports:

* Plone 4, 5 and 6
* Archetypes and Dexterity
* Python 2 and 3
* plone.app.multilingual, Products.LinguaPlone, raptus.multilanguagefields

Import supports:

* Plone 5.2+, Dexterity, Python 2 and 3, plone.app.multilingual


Installation
============

Install collective.exportimport as you would install any other Python package.

You don't need to activate the add-on in the Site Setup Add-ons control panel to be able to use the forms ``@@export_content`` and ``@@import_content`` in your site.

If you need help, see:

- for Plone 4: https://4.docs.plone.org/adapt-and-extend/install_add_ons.html
- for Plone 5: https://5.docs.plone.org/manage/installing/installing_addons.html
- for Plone 6: https://6.docs.plone.org/install/manage-add-ons-packages.html


Python 2 compatibility
----------------------

This package is compatible with Python 3 and Python 2.
Depending on the Python version, different versions of its dependencies will be installed.
If you run into problems, file an issue at: https://github.com/collective/collective.exportimport/issues


Usage
=====

Export
------

Use the form with the URL ``/@@export_content``, and select what you want to export:

.. image:: ./docs/export.png

You can export one or more types and a whole site or only a specific path in a site. Since items are exported ordered by path, importing them will create the same structure as you had originally.

The downloaded json-file will have the name of the path you exported from, e.g. ``Plone.json``.

The exports for members, relations, localroles and translations are linked to in this form but can also be called individually: ``/@@export_members``, ``/@@export_relations``, ``/@@export_localroles``, ``/@@export_translations``, ``/@@export_ordering``, ``/@@export_discussion``.
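
These individual views can also be called from Python, as in the ``@@export_all`` example further below. A minimal sketch (``portal`` and ``request`` are assumed to be the site root and the current request):

.. code-block:: python

    from plone import api

    view = api.content.get_view("export_relations", portal, request)
    # Saves var/instance/export_relations.json instead of downloading it
    view(download_to_server=True)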


Import
------

Use the form with the URL ``/@@import_content``, and upload a json-file that you want to import:

.. image:: ./docs/import.png


The imports for members, relations, localroles and translations are linked to in this form but can also be called individually: ``/@@import_members``, ``/@@import_relations``, ``/@@import_localroles``, ``/@@import_translations``, ``/@@import_ordering``, ``/@@import_discussion``.

As a last step in a migration there is another view ``@@reset_dates`` that resets the modified date on imported content to the date contained in the imported json-file. This is necessary since various changes during a migration will likely result in an updated modified-date. During import the original date is stored as ``obj.modification_date_migrated`` on each new object, and this view sets it as the modification date.
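
The view can also be called from Python at the end of an ``@@import_all``-style script, as shown further below; a minimal sketch:

.. code-block:: python

    from plone import api

    reset_dates = api.content.get_view("reset_dates", portal, request)
    reset_dates()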

Export- and import locations
----------------------------

If you select 'Save to file on server', the export view will save json-files in the ``var`` directory of your Plone instance, i.e. in ``/var/instance``.
The import view will look for files under ``/var/instance/import``.
These directories will normally be different for different Plone instances and may be on different servers.

You can set the environment variable ``COLLECTIVE_EXPORTIMPORT_CENTRAL_DIRECTORY`` to use a shared directory on one server or a network share.
With this variable set, collective.exportimport will both save to and load .json files from that directory.
This saves you from having to move .json files from the export location to the import location.
Be aware that the export views will overwrite any existing .json file with the same name in that directory.


Use-cases
=========

Migrations
----------

When an in-place migration is not required, you can use this add-on to migrate the most important parts of your site to json and then import it into a new Plone instance of your target version:

* Export content from a Plone site (it supports Plone 4 and 5, Archetypes and Dexterity, Python 2 and 3).
* Import the exported content into a new site (Plone 5.2+, Dexterity, Python 3)
* Export and import relations, users and groups with their roles, translations, local roles, ordering, default-pages, comments, portlets and redirects.

How to migrate additional features like Annotations or Marker Interfaces is discussed in the FAQ section.

Other
-----

You can use this add-on to

* Archive your content as JSON.
* Export data to prepare a migration to another system.
* Combine content from multiple Plone sites into one.
* Import a Plone site as a subsite into another.
* Import content from other systems as long as it fits the required format.
* Update or replace existing data.

Details
=======

Export content
--------------

Exporting content is basically a wrapper for the serializers of plone.restapi:

.. code-block:: python

    from plone.restapi.interfaces import ISerializeToJson
    from zope.component import getMultiAdapter

    serializer = getMultiAdapter((obj, request), ISerializeToJson)
    data = serializer(include_items=False)

Import content
--------------

Importing content is an elaborate wrapper for the deserializers of plone.restapi:

.. code-block:: python

    from plone.restapi.interfaces import IDeserializeFromJson
    from zope.component import getMultiAdapter

    new_id = container.invokeFactory(item['@type'], item['id'])
    new = container[new_id]
    deserializer = getMultiAdapter((new, self.request), IDeserializeFromJson)
    new = deserializer(validate_all=False, data=item)


Use for migrations
------------------

A main use case of this package is migrating from one Plone version to another.

Exporting Archetypes content and importing it as Dexterity content works fine, but due to changes in field names some settings would get lost.
For example the setting to exclude content from the navigation was renamed from ``excludeFromNav`` to ``exclude_from_nav``.

To fix this you can check the checkbox "Modify exported data for migrations".
This will modify the data during export:

* Drop unused data (e.g. `next_item` and `components`)
* Remove all relation fields
* Change some field names that changed between Archetypes and Dexterity

  * ``excludeFromNav`` → ``exclude_from_nav``
  * ``allowDiscussion`` → ``allow_discussion``
  * ``subject`` → ``subjects``
  * ``expirationDate`` → ``expires``
  * ``effectiveDate`` → ``effective``
  * ``creation_date`` → ``created``
  * ``modification_date`` → ``modified``
  * ``startDate`` → ``start``
  * ``endDate`` → ``end``
  * ``openEnd`` → ``open_end``
  * ``wholeDay`` → ``whole_day``
  * ``contactEmail`` → ``contact_email``
  * ``contactName`` → ``contact_name``
  * ``contactPhone`` → ``contact_phone``

* Update view names on Folders and Collections that changed since Plone 4.
* Export ``ATTopic`` and their criteria to Collections with querystrings.
* Update Collection-criteria.
* Links and images in Richtext-Fields of content and portlets have changed since Plone 4.
  The view ``/@@fix_html`` allows you to fix these.


Control creating imported content
---------------------------------

You can choose between four options for how to deal with content that already exists (a sketch of setting this option from Python follows the list):

  * Skip: Don't import at all
  * Replace: Delete item and create new
  * Update: Reuse and only overwrite imported data
  * Ignore: Create with a new id
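
A minimal sketch of selecting one of these options when calling the import from Python. It assumes the form parameter is called ``handle_existing_content`` and that its integer values follow the order of the list above (0 = Skip, 1 = Replace, 2 = Update, 3 = Ignore); check the import form of your version to be sure. ``portal`` and ``request`` are placeholders for the site root and the current request:

.. code-block:: python

    from plone import api

    view = api.content.get_view("import_content", portal, request)
    request.form["form.submitted"] = True
    request.form["handle_existing_content"] = 2  # assumption: 2 = Update (reuse and overwrite)
    view(server_file="Plone.json", return_json=True)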

Imported content is initially created with ``invokeFactory``, using the portal_type and id of the exported item, before the rest of the data is deserialized.
You can set additional values by specifying a dict ``factory_kwargs`` that will be passed to the factory.
This way you can set values on the imported object that subscribers to IObjectAddedEvent expect to be there.
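
A minimal sketch of providing ``factory_kwargs``, assuming the values can be attached to the item in a dict-hook of a custom import (see the customization examples below); adjust to where your version of the import expects them:

.. code-block:: python

    from collective.exportimport.import_content import ImportContent

    class CustomImportContent(ImportContent):

        def global_dict_hook(self, item):
            # Assumption: anything in item["factory_kwargs"] is passed to the
            # factory, so it is set before IObjectAddedEvent subscribers run.
            item["factory_kwargs"] = {"title": item.get("title", "")}
            return item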


Export versioned content
------------------------

Exporting versions of Archetypes content will not work because of a bug in plone.restapi (https://github.com/plone/plone.restapi/issues/1335).
For the export to work you need to use a plone.restapi version between 7.7.0 and 8.0.0 (if released) or a source checkout of the branch 7.x.x.


Notes on speed and large migrations
===================================

Exporting and importing large amounts of content can take a while. Export is pretty fast but import is constrained by some features of Plone, most importantly versioning:

* Importing 5000 Folders takes ~5 minutes
* Importing 5000 Documents takes >25 minutes because of versioning.
* Importing 5000 Documents without versioning takes ~7 minutes.

During import you can commit every x number of items, which will free up memory and disk space in your TMPDIR (where blobs are added before each commit).
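
For example, when driving the import from Python (as in the ``@@import_all`` example further below), the commit interval can be passed via the request form; a minimal sketch:

.. code-block:: python

    from plone import api

    view = api.content.get_view("import_content", portal, request)
    request.form["form.submitted"] = True
    request.form["commit"] = 500  # commit every 500 items
    view(server_file="Plone.json", return_json=True)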

When exporting large numbers of blobs (binary files and images) you will get huge json-files and may run out of memory.
You have various options to deal with this.
The best way depends on how you are going to import the blobs (a sketch of calling the export with each option follows this list):

- Export as download urls: small download, but ``collective.exportimport`` cannot import the blobs, so you will need your own import script to download them.
- Export as base-64 encoded strings: large download, but ``collective.exportimport`` can handle the import.
- Export as blob paths: small download and ``collective.exportimport`` can handle the import, but you need to copy ``var/blobstorage`` to the Plone Site where you do the import or set the environment variable ``COLLECTIVE_EXPORTIMPORT_BLOB_HOME`` to the old blobstorage path: ``export COLLECTIVE_EXPORTIMPORT_BLOB_HOME=/path-to-old-instance/var/blobstorage``.
  To export the blob-path you do not need to have access to the blobs!
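
A minimal sketch of triggering the export from Python. The value ``include_blobs=2`` for blob paths is also used in the ``@@export_all`` example further below; the assumption that 0 and 1 map to download urls and base-64 strings follows the order of the list above, so verify against the export form of your version:

.. code-block:: python

    from plone import api

    export_content = api.content.get_view("export_content", portal, request)
    request.form["form.submitted"] = True
    export_content(
        portal_type=["File", "Image"],
        include_blobs=2,  # 2 = blob paths; 0/1 assumed to be download urls / base64
        download_to_server=True,
    )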


Format of export and import of content
======================================

By default all content is exported to and imported from one large json-file.
To inspect such very large json-files without performance-issues you can use klogg (https://klogg.filimonov.dev).

Since version 1.10 collective.exportimport also supports exporting and importing each content item as a separate json-file.
To use it, select *Save each item as a separate file on the server* in the form, or specify ``download_to_server=2`` when calling the export in Python.
In the import form you can manually select a directory on the server, or specify ``server_directory="/mydir"`` when calling the import in Python.
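
A minimal sketch combining both, using the parameters named above (``/mydir`` is just the example directory from the text; ``portal`` and ``request`` are placeholders):

.. code-block:: python

    from plone import api

    # Export: write one json-file per item to the server
    export_content = api.content.get_view("export_content", portal, request)
    request.form["form.submitted"] = True
    export_content(portal_type=["Document"], download_to_server=2)

    # Import: read all json-files from a directory on the server
    import_content = api.content.get_view("import_content", portal, request)
    request.form["form.submitted"] = True
    import_content(server_directory="/mydir", return_json=True)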


Customize export and import
===========================

This add-on is designed to be adapted to your requirements and has multiple hooks to make that easy.

To make that easier, here are two packages you can reuse to override and extend the export and import.
Use these templates and adapt them to your own projects:

* https://github.com/starzel/contentexport
* https://github.com/starzel/contentimport

Many examples for customizing the export and import are collected in the chapter "FAQ, Tips and Tricks" below.

.. note::

    As a rule of thumb you should make changes to the data during import unless you need access to the original object for the required changes.
    One reason is that this way the serialized content in the json-file more closely represents the original data.
    Another reason is that it allows you to fix issues during the process you are currently developing (i.e. without having to redo the export).


Export Example
--------------

.. code-block:: python

    from collective.exportimport.export_content import ExportContent

    class CustomExportContent(ExportContent):

        QUERY = {
            'Document': {'review_state': ['published', 'pending']},
        }

        DROP_PATHS = [
            '/Plone/userportal',
            '/Plone/en/obsolete_content',
        ]

        DROP_UIDS = [
            '71e3e0a6f06942fea36536fbed0f6c42',
        ]

        def update(self):
            """Use this to override stuff before the export starts
            (e.g. force a specific language in the request)."""

        def start(self):
            """Hook to do something before export."""

        def finish(self):
            """Hook to do something after export."""

        def global_obj_hook(self, obj):
            """Inspect the content item before serialisation data.
            Bad: Changing the content-item is a horrible idea.
            Good: Return None if you want to skip this particular object.
            """
            return obj

        def global_dict_hook(self, item, obj):
            """Use this to modify or skip the serialized data.
            Return None if you want to skip this particular object.
            """
            return item

        def dict_hook_document(self, item, obj):
            """Use this to modify or skip the serialized data by type.
            Return the modified dict (item) or None if you want to skip this particular object.
            """
            return item


Register it with your own browserlayer to override the default.

.. code-block:: text

  <browser:page
      name="export_content"
      for="zope.interface.Interface"
      class=".custom_export.CustomExportContent"
      layer="My.Custom.IBrowserlayer"
      permission="cmf.ManagePortal"
      />


Import Example
--------------

.. code-block:: python

    from collective.exportimport.import_content import ImportContent

    class CustomImportContent(ImportContent):

        CONTAINER = {'Event': '/imported-events'}

        # These fields will be ignored
        DROP_FIELDS = ['relatedItems']

        # Items with these uid will be ignored
        DROP_UIDS = ['04d1477583c74552a7fcd81a9085c620']

        # These paths will be ignored
        DROP_PATHS = ['/Plone/doormat/', '/Plone/import_files/']

        # Default values for some fields
        DEFAULTS = {'which_price': 'normal'}

        def start(self):
            """Hook to do something before importing one file."""

        def finish(self):
            """Hook to do something after importing one file."""

        def global_dict_hook(self, item):
            if isinstance(item.get('description', None), dict):
                item['description'] = item['description']['data']
            if isinstance(item.get('rights', None), dict):
                item['rights'] = item['rights']['data']
            return item

        def dict_hook_customtype(self, item):
            # change the type
            item['@type'] = 'anothertype'
            # drop a field
            item.pop('experiences', None)
            return item

        def handle_file_container(self, item):
            """Use this to specify the container in which to create the item in.
            Return the container for this particular object.
            """
            return self.portal['imported_files']

Register it:

.. code-block:: text

  <browser:page
      name="import_content"
      for="zope.interface.Interface"
      class=".custom_import.CustomImportContent"
      layer="My.Custom.IBrowserlayer"
      permission="cmf.ManagePortal"
      />


Automate export and import
--------------------------

Run all exports and save all data in ``var/instance/``:

.. code-block:: python

    from plone import api
    from Products.Five import BrowserView

    class ExportAll(BrowserView):

        def __call__(self):
            export_content = api.content.get_view("export_content", self.context, self.request)
            self.request.form["form.submitted"] = True
            export_content(
                portal_type=["Folder", "Document", "News Item", "File", "Image"],  # only export these
                include_blobs=2,  # Export files and images as blob paths
                download_to_server=True)

            other_exports = [
                "export_relations",
                "export_members",
                "export_translations",
                "export_localroles",
                "export_ordering",
                "export_defaultpages",
                "export_discussion",
                "export_portlets",
                "export_redirects",
            ]
            for name in other_exports:
                view = api.content.get_view(name, self.context, self.request)
                # This saves each export in var/instance/export_xxx.json
                view(download_to_server=True)

            # Important! Redirect to prevent infinite export loop :)
            return self.request.response.redirect(self.context.absolute_url())

Run all imports using the data exported in the example above:

.. code-block:: python

    from App.config import getConfiguration
    from collective.exportimport.fix_html import fix_html_in_content_fields
    from collective.exportimport.fix_html import fix_html_in_portlets
    from pathlib import Path
    from plone import api
    from Products.Five import BrowserView

    import logging
    import transaction

    logger = logging.getLogger(__name__)

    class ImportAll(BrowserView):

        def __call__(self):
            portal = api.portal.get()

            # Import content
            view = api.content.get_view("import_content", portal, request)
            request.form["form.submitted"] = True
            request.form["commit"] = 500
            view(server_file="Plone.json", return_json=True)
            transaction.commit()

            # Run all other imports
            other_imports = [
                "relations",
                "members",
                "translations",
                "localroles",
                "ordering",
                "defaultpages",
                "discussion",
                "portlets",
                "redirects",
            ]
            cfg = getConfiguration()
            directory = Path(cfg.clienthome) / "import"
            for name in other_imports:
                view = api.content.get_view(f"import_{name}", portal, request)
                path = Path(directory) / f"export_{name}.json"
                results = view(jsonfile=path.read_text(), return_json=True)
                logger.info(results)
                transaction.commit()

            # Run cleanup steps
            results = fix_html_in_content_fields()
            logger.info("Fixed html for %s content items", results)
            transaction.commit()

            results = fix_html_in_portlets()
            logger.info("Fixed html for %s portlets", results)
            transaction.commit()

            reset_dates = api.content.get_view("reset_dates", portal, self.request)
            reset_dates()
            transaction.commit()

.. note::

    The views ``@@export_all`` and ``@@import_all`` are also contained in the helper-packages https://github.com/starzel/contentexport and https://github.com/starzel/contentimport

FAQ, Tips and Tricks
====================

This section covers frequent use-cases and examples for features that are not required for all migrations.

Using global_obj_hook during export
-----------------------------------

Use ``global_obj_hook`` during export to inspect content and decide whether to skip it.

.. code-block:: python

    def global_obj_hook(self, obj):
        # Drop subtopics
        if obj.portal_type == "Topic" and obj.__parent__.portal_type == "Topic":
            return

        # Drop files and images from PFG formfolders
        if obj.__parent__.portal_type == "FormFolder":
            return
        return obj


Using dict-hooks during export
------------------------------

Use ``global_dict_hook`` during export to inspect content and modify the serialized json.
You can also use ``dict_hook_<somecontenttype>`` to better structure your code for readability.

Sometimes you need to handle data that you add in ``global_dict_hook`` during export with corresponding code in ``global_obj_hook`` during import.

Exporting and importing a placeful workflow policy is a perfect example of that pattern:


Export/Import placeful workflow policy
--------------------------------------

Export:

.. code-block:: python

    def global_dict_hook(self, item, obj):
        if obj.isPrincipiaFolderish and ".wf_policy_config" in obj.keys():
            wf_policy = obj[".wf_policy_config"]
            item["exportimport.workflow_policy"] = {
                "workflow_policy_below": wf_policy.workflow_policy_below,
                "workflow_policy_in": wf_policy.workflow_policy_in,
            }
        return item

Import:

.. code-block:: python

    def global_obj_hook(self, obj, item):
        wf_policy = item.get("exportimport.workflow_policy")
        if wf_policy:
            obj.manage_addProduct["CMFPlacefulWorkflow"].manage_addWorkflowPolicyConfig()
            wf_policy_config = obj[".wf_policy_config"]
            wf_policy_config.setPolicyIn(wf_policy["workflow_policy_in"], update_security=True)
            wf_policy_config.setPolicyBelow(wf_policy["workflow_policy_below"], update_security=True)


Using dict-hooks during import
------------------------------

A lot of fixes can be done during import using the ``global_dict_hook`` or ``dict_hook_<contenttype>``.

Here we prevent the expiration date from being before the effective date, since that would lead to validation errors during deserialization:

.. code-block:: python

    def global_dict_hook(self, item):
        effective = item.get('effective', None)
        expires = item.get('expires', None)
        if effective and expires and expires <= effective:
            item.pop('expires')
        return item

Here we drop empty lines from the creators:

.. code-block:: python

    def global_dict_hook(self, item):
        item["creators"] = [i for i in item.get("creators", []) if i]
        return item

This example migrates a ``PloneHelpCenter`` to a simple folder/document structure during import.
There are a couple more types to handle (as folder or document) but you get the idea, don't you?

.. code-block:: python

    def dict_hook_helpcenter(self, item):
        item["@type"] = "Folder"
        item["layout"] = "listing_view"
        return item

    def dict_hook_helpcenterglossary(self, item):
        item["@type"] = "Folder"
        item["layout"] = "listing_view"
        return item

    def dict_hook_helpcenterinstructionalvideo(self, item):
        item["@type"] = "File"
        if item.get("video_file"):
            item["file"] = item["video_file"]
        return item

    def dict_hook_helpcenterlink(self, item):
        item["@type"] = "Link"
        item["remoteUrl"] = item.get("url", None)
        return item

    def dict_hook_helpcenterreferencemanualpage(self, item):
        item["@type"] = "Document"
        return item

If you change types during import you need to take care of other cases where types are referenced.
Examples are collection queries (see "Fixing invalid collection queries" below) or type constraints (see the following example):

.. code-block:: python

    PORTAL_TYPE_MAPPING = {
        "Topic": "Collection",
        "FormFolder": "EasyForm",
        "HelpCenter": "Folder",
    }

    # Types that are allowed as-is in the target site (adapt to your project)
    ALLOWED_TYPES = ["Folder", "Document", "News Item", "Event", "File", "Image", "Link", "Collection"]

    def global_dict_hook(self, item):
        if item.get("exportimport.constrains"):
            types_fixed = []
            for portal_type in item["exportimport.constrains"]["locally_allowed_types"]:
                if portal_type in PORTAL_TYPE_MAPPING:
                    types_fixed.append(PORTAL_TYPE_MAPPING[portal_type])
                elif portal_type in ALLOWED_TYPES:
                    types_fixed.append(portal_type)
            item["exportimport.constrains"]["locally_allowed_types"] = list(set(types_fixed))

            types_fixed = []
            for portal_type in item["exportimport.constrains"]["immediately_addable_types"]:
                if portal_type in PORTAL_TYPE_MAPPING:
                    types_fixed.append(PORTAL_TYPE_MAPPING[portal_type])
                elif portal_type in ALLOWED_TYPES:
                    types_fixed.append(portal_type)
            item["exportimport.constrains"]["immediately_addable_types"] = list(set(types_fixed))
        return item


Change workflow
---------------

.. code-block:: python

    REVIEW_STATE_MAPPING = {
        "internal": "published",
        "internally_published": "published",
        "obsolete": "private",
        "hidden": "private",
    }

    def global_dict_hook(self, item):
        if item.get("review_state") in REVIEW_STATE_MAPPING:
            item["review_state"] = REVIEW_STATE_MAPPING[item["review_state"]]
        return item


Export/Import Annotations
-------------------------

Some core-features of Plone (e.g. comments) use annotations to store data.
The core features are already covered but your custom code or community add-ons may use annotations as well.
Here is how you can migrate them.

**Export**: Only export those annotations that you really need.

.. code-block:: python

    from collective.exportimport.export_content import ExportContent
    from plone.restapi.interfaces import IJsonCompatible
    from zope.annotation.interfaces import IAnnotations

    ANNOTATIONS_TO_EXPORT = [
        "syndication_settings",
    ]
    ANNOTATIONS_KEY = 'exportimport.annotations'

    class CustomExportContent(ExportContent):

        def global_dict_hook(self, item, obj):
            item = self.export_annotations(item, obj)
            return item

        def export_annotations(self, item, obj):
            results = {}
            annotations = IAnnotations(obj)
            for key in ANNOTATIONS_TO_EXPORT:
                data = annotations.get(key)
                if data:
                    results[key] = IJsonCompatible(data, None)
            if results:
                item[ANNOTATIONS_KEY] = results
            return item

**Import**:

.. code-block:: python

    from collective.exportimport.import_content import ImportContent
    from zope.annotation.interfaces import IAnnotations

    ANNOTATIONS_KEY = "exportimport.annotations"

    class CustomImportContent(ImportContent):

        def global_obj_hook(self, obj, item):
            item = self.import_annotations(obj, item)
            return item

        def import_annotations(self, obj, item):
            annotations = IAnnotations(obj)
            for key in item.get(ANNOTATIONS_KEY, []):
                annotations[key] = item[ANNOTATIONS_KEY][key]
            return item

Some features also store data in annotations on the portal, e.g. `plone.contentrules.localassignments`, `plone.portlets.categoryblackliststatus`, `plone.portlets.contextassignments`, `syndication_settings`.
Depending on your requirements you may want to export and import those as well.
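
A hedged sketch of exporting such portal annotations with a small additional view, reusing ``BaseExport`` like the other examples in this document (the list of keys is only an example and the stored values must be JSON-compatible):

.. code-block:: python

    from collective.exportimport.export_other import BaseExport
    from plone import api
    from plone.restapi.serializer.converters import json_compatible
    from zope.annotation.interfaces import IAnnotations

    PORTAL_ANNOTATIONS_TO_EXPORT = [
        "syndication_settings",
    ]

    class ExportPortalAnnotations(BaseExport):

        def __call__(self, download_to_server=False):
            self.title = "Export portal annotations"
            self.download_to_server = download_to_server
            annotations = IAnnotations(api.portal.get())
            data = {
                key: json_compatible(annotations.get(key))
                for key in PORTAL_ANNOTATIONS_TO_EXPORT
                if annotations.get(key) is not None
            }
            self.download(data)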


Export/Import Marker Interfaces
-------------------------------

**Export**: You may only want to export the marker interfaces you actually need.
It is a good idea to inspect a list of all used marker interfaces in the portal before deciding what to migrate (see the sketch below).
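
A minimal sketch of such an inventory, counting the directly provided interfaces of all content objects (run it for example in a ``bin/instance debug`` session or a small browser view):

.. code-block:: python

    from collections import Counter
    from plone import api
    from zope.interface import directlyProvidedBy

    def inventory_marker_interfaces():
        """Count marker interfaces directly provided by content objects."""
        counter = Counter()
        portal = api.portal.get()
        for brain in api.content.find(context=portal):
            obj = brain.getObject()
            for iface in directlyProvidedBy(obj):
                counter[iface.__identifier__] += 1
        return counter

    for name, amount in inventory_marker_interfaces().most_common():
        print(name, amount)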

.. code-block:: python

    from collective.exportimport.export_content import ExportContent
    from zope.interface import directlyProvidedBy

    MARKER_INTERFACES_TO_EXPORT = [
        "collective.easyslider.interfaces.ISliderPage",
        "plone.app.layout.navigation.interfaces.INavigationRoot",
    ]
    MARKER_INTERFACES_KEY = "exportimport.marker_interfaces"

    class CustomExportContent(ExportContent):

        def global_dict_hook(self, item, obj):
            item = self.export_marker_interfaces(item, obj)
            return item

        def export_marker_interfaces(self, item, obj):
            interfaces = [i.__identifier__ for i in directlyProvidedBy(obj)]
            interfaces = [i for i in interfaces if i in MARKER_INTERFACES_TO_EXPORT]
            if interfaces:
                item[MARKER_INTERFACES_KEY] = interfaces
            return item

**Import**:

.. code-block:: python

    from collective.exportimport.import_content import ImportContent
    from plone.dexterity.utils import resolveDottedName
    from zope.interface import alsoProvides

    import logging

    logger = logging.getLogger(__name__)

    MARKER_INTERFACES_KEY = "exportimport.marker_interfaces"

    class CustomImportContent(ImportContent):

        def global_obj_hook_before_deserializing(self, obj, item):
            """Apply marker interfaces before deserializing."""
            for iface_name in item.pop(MARKER_INTERFACES_KEY, []):
                try:
                    iface = resolveDottedName(iface_name)
                    if not iface.providedBy(obj):
                        alsoProvides(obj, iface)
                        logger.info("Applied marker interface %s to %s", iface_name, obj.absolute_url())
                except ModuleNotFoundError:
                    pass
            return obj, item

Skip versioning during import
-----------------------------

The event-handlers of versioning can seriously slow down your imports.
It is a good idea to disable it before the import:

.. code-block:: python

    VERSIONED_TYPES = [
        "Document",
        "News Item",
        "Event",
        "Link",
    ]

    def start(self):
        portal_types = api.portal.get_tool("portal_types")
        for portal_type in VERSIONED_TYPES:
            fti = portal_types.get(portal_type)
            behaviors = list(fti.behaviors)
            if 'plone.versioning' in behaviors:
                logger.info(f"Disable versioning for {portal_type}")
                behaviors.remove('plone.versioning')
            fti.behaviors = behaviors

Re-enable versioning and create initial versions after all imports and fixes are done, e.g. in the view ``@@import_all``.

.. code-block:: python

    from Products.CMFEditions.interfaces.IModifier import FileTooLargeToVersionError

    VERSIONED_TYPES = [
        "Document",
        "News Item",
        "Event",
        "Link",
    ]

    class ImportAll(BrowserView):

        def __call__(self):
            # ... run all other imports first (see the example above), then:

            # re-enable versioning
            portal_types = api.portal.get_tool("portal_types")
            for portal_type in VERSIONED_TYPES:
                fti = portal_types.get(portal_type)
                behaviors = list(fti.behaviors)
                if "plone.versioning" not in behaviors:
                    behaviors.append("plone.versioning")
                    logger.info(f"Enable versioning for {portal_type}")
                if "plone.locking" not in behaviors:
                    behaviors.append("plone.locking")
                    logger.info(f"Enable locking for {portal_type}")
                fti.behaviors = behaviors
            transaction.get().note("Re-enabled versioning")
            transaction.commit()

            # create initial version for all versioned types
            logger.info("Creating initial versions")
            portal_repository = api.portal.get_tool("portal_repository")
            brains = api.content.find(portal_type=VERSIONED_TYPES)
            total = len(brains)
            for index, brain in enumerate(brains):
                obj = brain.getObject()
                try:
                    portal_repository.save(obj=obj, comment="Imported Version")
                except FileTooLargeToVersionError:
                    pass
                if not index % 1000:
                    msg = f"Created versions for {index} of {total} items."
                    logger.info(msg)
                    transaction.get().note(msg)
                    transaction.commit()
            msg = "Created initial versions"
            transaction.get().note(msg)
            transaction.commit()


Dealing with validation errors
------------------------------

Sometimes you get validation-errors during import because the data cannot be validated.
That can happen when options in a field are generated from content in the site.
In these cases you cannot be sure that all options already exist in the portal while importing the content.

It may also happen when you have validators that rely on content or configuration that does not exist at the time of the import.

.. note::

    For relation fields this is not necessary since relations are imported after content anyway!

There are two ways to handle these issues:

* Use a simple setter bypassing the validation used by the restapi
* Defer the import until all other imports have run


Use a simple setter
*******************

You need to specify which content-types and fields you want to handle that way.

The value is moved to a key that the normal import will ignore, and it is set using ``setattr()`` before the rest of the data is deserialized.

.. code-block:: python

    SIMPLE_SETTER_FIELDS = {
        "ALL": ["some_shared_field"],
        "CollaborationFolder": ["allowedPartnerDocTypes"],
        "DocType": ["automaticTransferTargets"],
        "DPDocument": ["scenarios"],
        "DPEvent" : ["Status"],
    }

    class CustomImportContent(ImportContent):

        def global_dict_hook(self, item):
            simple = {}
            for fieldname in SIMPLE_SETTER_FIELDS.get("ALL", []):
                if fieldname in item:
                    value = item.pop(fieldname)
                    if value:
                        simple[fieldname] = value
            for fieldname in SIMPLE_SETTER_FIELDS.get(item["@type"], []):
                if fieldname in item:
                    value = item.pop(fieldname)
                    if value:
                        simple[fieldname] = value
            if simple:
                item["exportimport.simplesetter"] = simple
            return item

        def global_obj_hook_before_deserializing(self, obj, item):
            """Hook to modify the created obj before deserializing the data.
            """
            # import simplesetter data before the rest
            for fieldname, value in item.get("exportimport.simplesetter", {}).items():
                setattr(obj, fieldname, value)
            return obj, item

.. note::

    Using ``global_obj_hook_before_deserializing`` makes sure that data is there when the event-handlers are run after import.

Defer import
************

You can also wait until all content is imported before setting the values on these fields.
Again you need to find out which fields for which types you want to handle that way.

Here the data is stored in an annotation on the imported object from which it is later read.
This example also supports setting some data with ``setattr`` without validating it:

.. code-block:: python

    from collective.exportimport.import_content import ImportContent
    from plone.restapi.interfaces import IDeserializeFromJson
    from zope.annotation.interfaces import IAnnotations
    from zope.component import getMultiAdapter

    DEFERRED_KEY = "exportimport.deferred"
    DEFERRED_FIELD_MAPPING = {
        "talk": ["somefield"],
        "speaker": [
            "custom_field",
            "another_field",
        ]
    }
    SIMPLE_SETTER_FIELDS = {"custom_type": ["another_field"]}

    class CustomImportContent(ImportContent):

        def global_dict_hook(self, item):
            # Move deferred values to a different key to not deserialize.
            # This could also be done during export.
            item[DEFERRED_KEY] = {}
            for fieldname in DEFERRED_FIELD_MAPPING.get(item["@type"], []):
                if item.get(fieldname):
                    item[DEFERRED_KEY][fieldname] = item.pop(fieldname)
            return item

        def global_obj_hook(self, obj, item):
            # Store deferred data in an annotation.
            deferred = item.get(DEFERRED_KEY, {})
            if deferred:
                annotations = IAnnotations(obj)
                annotations[DEFERRED_KEY] = {}
                for key, value in deferred.items():
                    annotations[DEFERRED_KEY][key] = value

You then need a new step in the migration to move the deferred values from the annotation to the field:

.. code-block:: python

    from plone import api
    from Products.Five import BrowserView

    import logging

    logger = logging.getLogger(__name__)


    class ImportDeferred(BrowserView):

        def __call__(self):
            # This example reuses the form export_other.pt from collective.exportimport
            self.title = "Import deferred data"
            if not self.request.form.get("form.submitted", False):
                return self.index()
            self.results = []
            for brain in api.content.find(portal_type=list(DEFERRED_FIELD_MAPPING.keys())):
                obj = brain.getObject()
                self.import_deferred(obj)
            api.portal.show_message(f"Imported deferred data for {len(self.results)} items!", self.request)
            return self.index()

        def import_deferred(self, obj):
            annotations = IAnnotations(obj, {})
            deferred = annotations.get(DEFERRED_KEY, None)
            if not deferred:
                return
            # Shortcut for simple fields (e.g. storing strings, uuids etc.)
            for fieldname in SIMPLE_SETTER_FIELDS.get(obj.portal_type, []):
                value = deferred.pop(fieldname, None)
                if value:
                    setattr(obj, fieldname, value)
            if not deferred:
                return
            # This approach validates the values and converts more complex data
            deserializer = getMultiAdapter((obj, self.request), IDeserializeFromJson)
            try:
                obj = deserializer(validate_all=False, data=deferred)
            except Exception as e:
                logger.info("Error while importing deferred data for %s", obj.absolute_url(), exc_info=True)
                logger.info("Data: %s", deferred)
            else:
                self.results.append(obj.absolute_url())
            # cleanup
            del annotations[DEFERRED_KEY]

This additional view obviously needs to be registered:

.. code-block:: text

    <browser:page
        name="import_deferred"
        for="zope.interface.Interface"
        class=".import_content.ImportDeferred"
        template="export_other.pt"
        permission="cmf.ManagePortal"
        />


Handle LinguaPlone content
--------------------------

Export:

.. code-block:: python

    def global_dict_hook(self, item, obj):
        # Find the language of the nearest parent that has a language.
        # Useful for LinguaPlone sites where some content is language-independent.
        parent = obj.__parent__
        for ancestor in parent.aq_chain:
            if IPloneSiteRoot.providedBy(ancestor):
                # keep language for root content
                nearest_ancestor_lang = item["language"]
                break
            if getattr(ancestor, "getLanguage", None) and ancestor.getLanguage():
                nearest_ancestor_lang = ancestor.getLanguage()
                item["parent"]["language"] = nearest_ancestor_lang
                break

        # This forces "wrong" languages to the nearest parents language
        if "language" in item and item["language"] != nearest_ancestor_lang:
            logger.info(u"Forcing %s (was %s) for %s %s ", nearest_ancestor_lang, item["language"], item["@type"], item["@id"])
            item["language"] = nearest_ancestor_lang

        # set missing language
        if not item.get("language"):
            item["language"] = nearest_ancestor_lang

        # add info on translations to help find the right container
        # usually this is done by export_translations
        # but when migrating from LinguaPlone to plone.app.multilingual you
        # sometimes want to check the translation info during import
        if getattr(obj.aq_base, "getTranslations", None) is not None:
            translations = obj.getTranslations()
            if translations:
                item["translation"] = {}
                for lang in translations:
                    uuid = IUUID(translations[lang][0], None)
                    if uuid == item["UID"]:
                        continue
                    translation = translations[lang][0]
                    if not lang:
                        lang = "no_language"
                    item["translation"][lang] = translation.absolute_url()

Import:

.. code-block:: python

    def global_dict_hook(self, item):

        # Adapt this to your site
        languages = ["en", "fr", "de"]
        default_language = "en"
        portal_id = "Plone"

        # No language => lang of parent or default
        if item.get("language") not in languages:
            if item["parent"].get("language"):
                item["language"] = item["parent"]["language"]
            else:
                item["language"] = default_language

        lang = item["language"]

        if item["parent"].get("language") != item["language"]:
            logger.debug(f"Inconsistent lang: item is {lang}, parent is {item['parent'].get('language')} for {item['@id']}")

        # Move item to the correct language-root-folder
        # This is only relevant for items in the site-root.
        # For most items the container is looked up by the uuid of the old parent.
        url = item["@id"]
        parent_url = item["parent"]["@id"]

        url = url.replace(f"/{portal_id}/", f"/{portal_id}/{lang}/", 1)
        parent_url = parent_url.replace(f"/{portal_id}", f"/{portal_id}/{lang}", 1)

        item["@id"] = url
        item["parent"]["@id"] = parent_url

        return item

Alternative ways to handle items without parent
-----------------------------------------------

Often it is better to export and log items for which no container could be found instead of re-creating the original structure.

.. code-block:: python

    def update(self):
        self.items_without_parent = []

    def create_container(self, item):
        # Override create_container to never create parents
        self.items_without_parent.append(item)

    def finish(self):
        # export content without parents
        if self.items_without_parent:
            data = json.dumps(self.items_without_parent, sort_keys=True, indent=4)
            number = len(self.items_without_parent)
            cfg = getConfiguration()
            filename = 'content_without_parent.json'
            filepath = os.path.join(cfg.clienthome, filename)
            with open(filepath, 'w') as f:
                f.write(data)
            msg = u"Saved {} items without parent to {}".format(number, filepath)
            logger.info(msg)
            api.portal.show_message(msg, self.request)


Export/Import Zope Users
------------------------

By default only users and groups stored in Plone are exported/imported.
You can export/import Zope users like this.

**Export**

.. code-block:: python

    from collective.exportimport.export_other import BaseExport
    from plone import api
    from plone.restapi.serializer.converters import json_compatible

    class ExportZopeUsers(BaseExport):

        AUTO_ROLES = ["Authenticated"]

        def __call__(self, download_to_server=False):
            self.title = "Export Zope users"
            self.download_to_server = download_to_server
            portal = api.portal.get()
            app = portal.__parent__
            self.acl = app.acl_users
            self.pms = api.portal.get_tool("portal_membership")
            data = self.all_zope_users()
            self.download(data)

        def all_zope_users(self):
            results = []
            for user in self.acl.searchUsers():
                data = self._getUserData(user["userid"])
                data['title'] = user['title']
                results.append(data)
            return results

        def _getUserData(self, userId):
            member = self.pms.getMemberById(userId)
            roles = [
                role
                for role in member.getRoles()
                if role not in self.AUTO_ROLES
            ]
            # userid, password, roles
            props = {
                "username": userId,
                "password": json_compatible(self._getUserPassword(userId)),
                "roles": json_compatible(roles),
            }
            return props

        def _getUserPassword(self, userId):
            users = self.acl.users
            passwords = users._user_passwords
            password = passwords.get(userId, "")
            return password

**Import**:

.. code-block:: python

    from plone import api
    from Products.Five import BrowserView
    from ZPublisher.HTTPRequest import FileUpload

    import json
    import logging

    logger = logging.getLogger(__name__)


    class ImportZopeUsers(BrowserView):

        def __call__(self, jsonfile=None, return_json=False):
            if jsonfile:
                self.portal = api.portal.get()
                status = "success"
                try:
                    if isinstance(jsonfile, str):
                        return_json = True
                        data = json.loads(jsonfile)
                    elif isinstance(jsonfile, FileUpload):
                        data = json.loads(jsonfile.read())
                    else:
                        raise ("Data is neither text nor upload.")
                except Exception as e:
                    status = "error"
                    logger.error(e)
                    api.portal.show_message(
                        u"Failure while uploading: {}".format(e),
                        request=self.request,
                    )
                else:
                    members = self.import_members(data)
                    msg = u"Imported {} members".format(members)
                    api.portal.show_message(msg, self.request)
                if return_json:
                    msg = {"state": status, "msg": msg}
                    return json.dumps(msg)

            return self.index()

        def import_members(self, data):
            app = self.portal.__parent__
            acl = app.acl_users
            counter = 0
            for item in data:
                username = item["username"]
                password = item.pop("password")
                roles = item.pop("roles", [])
                if not username or not password or not roles:
                    continue
                title = item.pop("title", None)
                acl.users.addUser(username, title, password)
                for role in roles:
                    acl.roles.assignRoleToPrincipal(role, username)
                counter += 1
            return counter


Export/Import properties, registry-settings and installed add-ons
-----------------------------------------------------------------

When you migrate multiple similar sites that are configured manually it can be useful to export and import configuration that was set by hand.

Export/Import installed settings and add-ons
********************************************

These custom views export selected settings and add-ons from a Plone 4.3 site and import them into the new site.

**Export:**

.. code-block:: python

    from collective.exportimport.export_other import BaseExport
    from logging import getLogger
    from plone import api
    from plone.restapi.serializer.converters import json_compatible

    logger = getLogger(__name__)


    class ExportSettings(BaseExport):
        """Export various settings for haiku sites
        """

        def __call__(self, download_to_server=False):
            self.title = "Export installed add-ons various settings"
            self.download_to_server = download_to_server
            if not self.request.form.get("form.submitted", False):
                return self.index()

            data = self.export_settings()
            self.download(data)

        def export_settings(self):
            results = {}
            addons = []
            qi = api.portal.get_tool("portal_quickinstaller")
            for product in qi.listInstalledProducts():
                if product["id"].startswith("myproject."):
                    addons.append(product["id"])
            results["addons"] = addons

            portal = api.portal.get()
            registry = {}
            registry["plone.email_from_name"] = portal.getProperty('email_from_name', '')
            registry["plone.email_from_address"] = portal.getProperty('email_from_address', '')
            registry["plone.smtp_host"] = getattr(portal.MailHost, 'smtp_host', '')
            registry["plone.smtp_port"] = int(getattr(portal.MailHost, 'smtp_port', 25))
            registry["plone.smtp_userid"] = portal.MailHost.get('smtp_user_id')
            registry["plone.smtp_pass"] = portal.MailHost.get('smtp_pass')
            registry["plone.site_title"] = portal.title

            portal_properties = api.portal.get_tool("portal_properties")
            iprops = portal_properties.imaging_properties
            registry["plone.allowed_sizes"] = iprops.getProperty('allowed_sizes')
            registry["plone.quality"] = iprops.getProperty('quality')
            site_props = portal_properties.site_properties
            if site_props.hasProperty("webstats_js"):
                registry["plone.webstats_js"] = site_props.webstats_js
            results["registry"] = json_compatible(registry)
            return results


**Import:**

The import installs the add-ons and loads the settings into the registry.
Since Plone 5 ``portal_properties`` is no longer used.

.. code-block:: python

    from logging import getLogger
    from plone import api
    from plone.registry.interfaces import IRegistry
    from Products.CMFPlone.utils import get_installer
    from Products.Five import BrowserView
    from zope.component import getUtility
    from ZPublisher.HTTPRequest import FileUpload

    import json

    logger = getLogger(__name__)

    class ImportSettings(BrowserView):
        """Import various settings"""

        def __call__(self, jsonfile=None, return_json=False):
            if jsonfile:
                self.portal = api.portal.get()
                status = "success"
                try:
                    if isinstance(jsonfile, str):
                        return_json = True
                        data = json.loads(jsonfile)
                    elif isinstance(jsonfile, FileUpload):
                        data = json.loads(jsonfile.read())
                    else:
                        raise ("Data is neither text nor upload.")
                except Exception as e:
                    status = "error"
                    logger.error(e)
                    api.portal.show_message(
                        "Failure while uploading: {}".format(e),
                        request=self.request,
                    )
                else:
                    self.import_settings(data)
                    msg = "Imported addons and settings"
                    api.portal.show_message(msg, self.request)
                if return_json:
                    msg = {"state": status, "msg": msg}
                    return json.dumps(msg)

            return self.index()

        def import_settings(self, data):
            installer = get_installer(self.context)
            for addon in data["addons"]:
                if not installer.is_product_installed(addon) and installer.is_product_installable(addon):
                    installer.install_product(addon)
                    logger.info(f"Installed addon {addon}")
            registry = getUtility(IRegistry)
            for key, value in data["registry"].items():
                registry[key] = value
                logger.info(f"Imported record {key}: {value}")


Export/Import registry settings
*******************************

The pull request https://github.com/collective/collective.exportimport/pull/130 adds the views ``@@export_registry`` and ``@@import_registry``.
These views export and import registry records that do not use the default value specified in the schema for that record.
The export alone can also be useful to figure out which settings were modified for a site.

That code will probably not be merged but you can use it in your own projects.

Migrate PloneFormGen to Easyform
--------------------------------

To be able to export PFG as Easyform you should use the branch ``migration_features_1.x`` of ``collective.easyform`` in your old site.
Easyform does not need to be installed; we only need its methods ``fields_model`` and ``actions_model``.

Export:

.. code-block:: python

    def dict_hook_formfolder(self, item, obj):
        item["@type"] = "EasyForm"
        item["is_folderish"] = False

        from collective.easyform.migration.fields import fields_model
        from collective.easyform.migration.actions import actions_model

        # this does most of the heavy lifting...
        item["fields_model"] = fields_model(obj)
        item["actions_model"] = actions_model(obj)

        # handle thankspage
        pfg_thankspage = obj.get(obj.getThanksPage(), None)
        if pfg_thankspage:
            item["thankstitle"] = pfg_thankspage.title
            item["thanksdescription"] = pfg_thankspage.Description()
            item["showAll"] = pfg_thankspage.showAll
            item["showFields"] = pfg_thankspage.showFields
            item["includeEmpties"] = pfg_thankspage.includeEmpties
            item["thanksPrologue"] = json_compatible(pfg_thankspage.thanksPrologue.raw)
            item["thanksEpilogue"] = json_compatible(pfg_thankspage.thanksEpilogue.raw)

        # optional
        item["exportimport._inputStorage"] = self.export_saved_data(obj)

        # Drop some PFG fields no longer needed
        obsolete_fields = [
            "layout",
            "actionAdapter",
            "checkAuthenticator",
            "constrainTypesMode",
            "location",
            "thanksPage",
        ]
        for key in obsolete_fields:
            item.pop(key, None)

        # optional: disable tabs for imported forms
        item["form_tabbing"] = False

        # fix some custom validators
        replace_mapping = {
            "request.form['": "request.form['form.widgets.",
            "request.form.get('": "request.form.get('form.widgets.",
            "member and member.id or ''": "member and member.getProperty('id', '') or ''",
        }

        # fix overrides in actions and fields to use form.widgets.xyz instead of xyz
        for schema in ["actions_model", "fields_model"]:
            for old, new in replace_mapping.items():
                if old in item[schema]:
                    item[schema] = item[schema].replace(old, new)

            # add your own fields if you have these issues...
            for fieldname in [
                "email",
                "replyto",
            ]:
                if "request/form/{}".format(fieldname) in item[schema]:
                    item[schema] = item[schema].replace("request/form/{}".format(fieldname), "python: request.form.get('form.widgets.{}')".format(fieldname))

        return item

    def export_saved_data(self, obj):
        actions = {}
        for data_adapter in obj.objectValues("FormSaveDataAdapter"):
            data_adapter_name = data_adapter.getId()
            actions[data_adapter_name] = {}
            cols = data_adapter.getColumnNames()
            column_count_mismatch = False
            for idx, row in enumerate(data_adapter.getSavedFormInput()):
                if len(row) != len(cols):
                    column_count_mismatch = True
                    logger.debug("Column count mismatch at row %s", idx)
                    continue
                data = {}
                for key, value in zip(cols, row):
                    data[key] = json_compatible(value)
                id_ = int(time() * 1000)
                while id_ in actions[data_adapter_name]:  # avoid collisions during export
                    id_ += 1
                data["id"] = id_
                actions[data_adapter_name][id_] = data
            if column_count_mismatch:
                logger.info(
                    "Number of columns does not match for all rows. Some data were skipped in "
                    "data adapter %s/%s",
                    "/".join(obj.getPhysicalPath()),
                    data_adapter_name,
                )
        return actions

Import exported ``PloneFormGen`` data into ``Easyform``:

.. code-block:: python

    def obj_hook_easyform(self, obj, item):
        if not item.get("exportimport._inputStorage"):
            return
        from collective.easyform.actions import SavedDataBTree
        from persistent.mapping import PersistentMapping
        if not hasattr(obj, '_inputStorage'):
            obj._inputStorage = PersistentMapping()
        for name, data in item["exportimport._inputStorage"].items():
            obj._inputStorage[name] = SavedDataBTree()
            for key, row in data.items():
                obj._inputStorage[name][int(key)] = row


Export and import collective.cover content
------------------------------------------

Export:

.. code-block:: python

    from collective.exportimport.serializer import get_dx_blob_path
    from plone.app.textfield.value import RichTextValue
    from plone.namedfile.file import NamedBlobImage
    from plone.restapi.interfaces import IJsonCompatible
    from z3c.relationfield import RelationValue
    from zope.annotation.interfaces import IAnnotations

    def global_dict_hook(self, item, obj):
        item = self.handle_cover(item, obj)
        return item

    def handle_cover(self, item, obj):
        if ICover.providedBy(obj):
            item['tiles'] = {}
            annotations = IAnnotations(obj)
            for tile in obj.get_tiles():
                annotation_key = 'plone.tiles.data.{}'.format(tile['id'])
                annotation = annotations.get(annotation_key, None)
                if annotation is None:
                    continue
                tile_data = self.serialize_tile(annotation)
                tile_data['type'] = tile['type']
                item['tiles'][tile['id']] = tile_data
        return item

    def serialize_tile(self, annotation):
        data = {}
        for key, value in annotation.items():
            if isinstance(value, RichTextValue):
                value = value.raw
            elif isinstance(value, RelationValue):
                value = value.to_object.UID()
            elif isinstance(value, NamedBlobImage):
                blobfilepath = get_dx_blob_path(value)
                if not blobfilepath:
                    continue
                value = {
                    "filename": value.filename,
                    "content-type": value.contentType,
                    "size": value.getSize(),
                    "blob_path": blobfilepath,
                }
            data[key] = IJsonCompatible(value, None)
        return data

Import:

.. code-block:: python

    from collections import defaultdict
    from collective.exportimport.import_content import get_absolute_blob_path
    from plone.app.textfield.interfaces import IRichText
    from plone.app.textfield.interfaces import IRichTextValue
    from plone.app.textfield.value import RichTextValue
    from plone.namedfile.file import NamedBlobImage
    from plone.namedfile.interfaces import INamedBlobImageField
    from plone.tiles.interfaces import ITileType
    from zope.annotation.interfaces import IAnnotations
    from zope.component import getUtilitiesFor
    from zope.schema import getFieldsInOrder

    COVER_CONTENT = [
        "collective.cover.content",
    ]

    def global_obj_hook(self, obj, item):
        if item["@type"] in COVER_CONTENT and "tiles" in item:
            item = self.import_tiles(obj, item)

    def import_tiles(self, obj, item):
        RICHTEXT_TILES = defaultdict(list)
        IMAGE_TILES = defaultdict(list)
        for tile_name, tile_type in getUtilitiesFor(ITileType):
            for fieldname, field in getFieldsInOrder(tile_type.schema):
                if IRichText.providedBy(field):
                    RICHTEXT_TILES[tile_name].append(fieldname)
                if INamedBlobImageField.providedBy(field):
                    IMAGE_TILES[tile_name].append(fieldname)

        annotations = IAnnotations(obj)
        prefix = "plone.tiles.data."
        for uid, tile in item["tiles"].items():
            # TODO: Maybe create all tiles that do not need to be deferred?
            key = prefix + uid
            tile_name = tile.pop("type", None)
            # first set raw data
            annotations[key] = item["tiles"][uid]
            for fieldname in RICHTEXT_TILES.get(tile_name, []):
                raw = annotations[key][fieldname]
                if raw is not None and not IRichTextValue.providedBy(raw):
                    annotations[key][fieldname] = RichTextValue(raw, "text/html", "text/x-html-safe")
            for fieldname in IMAGE_TILES.get(tile_name, []):
                data = annotations[key][fieldname]
                if data is not None:
                    blob_path = data.get("blob_path")
                    if not blob_path:
                        continue

                    abs_blob_path = get_absolute_blob_path(obj, blob_path)
                    if not abs_blob_path:
                        logger.info("Blob path %s for tile %s of %s %s does not exist!", blob_path, tile, obj.portal_type, obj.absolute_url())
                        continue
                    # Determine the class to use: file or image.
                    filename = data["filename"]
                    content_type = data["content-type"]

                    # Write the field.
                    with open(abs_blob_path, "rb") as myfile:
                        blobdata = myfile.read()
                    image = NamedBlobImage(
                        data=blobdata,
                        contentType=content_type,
                        filename=filename,
                    )
                    annotations[key][fieldname] = image
        return item


Fixing invalid collection queries
---------------------------------

Some collection queries changed between Plone 4 and 5.
The following import hook fixes these issues.

The actual migration of topics to collections in ``collective.exportimport.serializer.SerializeTopicToJson`` does not (yet) take care of that.

.. code-block:: python

    class CustomImportContent(ImportContent):

        def global_dict_hook(self, item):
            if item["@type"] in ["Collection", "Topic"]:
                item = self.fix_query(item)
            return item

        def fix_query(self, item):
            item["@type"] = "Collection"
            query = item.pop("query", [])
            if not query:
                logger.info("Drop item without query: %s", item["@id"])
                return

            fixed_query = []
            indexes_to_fix = [
                "portal_type",
                "review_state",
                "Creator",
                "Subject",
            ]
            operator_mapping = {
                # old -> new
                "plone.app.querystring.operation.selection.is":
                    "plone.app.querystring.operation.selection.any",
                "plone.app.querystring.operation.string.is":
                    "plone.app.querystring.operation.selection.any",
            }

            for crit in query:
                if crit["i"] == "portal_type" and len(crit["v"]) > 30:
                    # Criterion is all types
                    continue

                if crit["o"].endswith("relativePath") and crit["v"] == "..":
                    # relativePath no longer accepts ..
                    crit["v"] = "..::1"

                if crit["i"] in indexes_to_fix:
                    for old_operator, new_operator in operator_mapping.items():
                        if crit["o"] == old_operator:
                            crit["o"] = new_operator

                if crit["i"] == "portal_type":
                    # Some types may have changed their names
                    fixed_types = []
                    for portal_type in crit["v"]:
                        fixed_type = PORTAL_TYPE_MAPPING.get(portal_type, portal_type)
                        fixed_types.append(fixed_type)
                    crit["v"] = list(set(fixed_types))

                if crit["i"] == "review_state":
                    # Review states may have changed their names
                    fixed_states = []
                    for review_state in crit["v"]:
                        fixed_state = REVIEW_STATE_MAPPING.get(review_state, review_state)
                        fixed_states.append(fixed_state)
                    crit["v"] = list(set(fixed_states))

                if crit["o"] == "plone.app.querystring.operation.string.currentUser":
                    crit["v"] = ""

                fixed_query.append(crit)
            item["query"] = fixed_query

            if not item["query"]:
                logger.info("Drop collection without query: %s", item["@id"])
                return
            return item


Migrate to Volto
----------------

You can reuse the migration-code provided by ``@@migrate_to_volto`` in ``plone.volto`` in a migration.
The following example (used for migrating https://plone.org to Volto) can be used to migrate a site from any older version to Plone 6 with Volto.

You need to have the Blocks Conversion Tool (https://github.com/plone/blocks-conversion-tool) running; it takes care of migrating richtext values to Volto blocks.

See https://6.docs.plone.org/backend/upgrading/version-specific-migration/migrate-to-volto.html for more details on the changes the migration to Volto does.


.. code-block:: python

    from App.config import getConfiguration
    from bs4 import BeautifulSoup
    from collective.exportimport.fix_html import fix_html_in_content_fields
    from collective.exportimport.fix_html import fix_html_in_portlets
    from contentimport.interfaces import IContentimportLayer
    from logging import getLogger
    from pathlib import Path
    from plone import api
    from plone.volto.browser.migrate_to_volto import migrate_richtext_to_blocks
    from plone.volto.setuphandlers import add_behavior
    from plone.volto.setuphandlers import remove_behavior
    from Products.CMFPlone.utils import get_installer
    from Products.Five import BrowserView
    from zope.interface import alsoProvides

    import requests
    import transaction

    logger = getLogger(__name__)

    DEFAULT_ADDONS = []


    class ImportAll(BrowserView):

        def __call__(self):

            request = self.request

            # Check if Blocks-conversion-tool is running
            headers = {
                "Accept": "application/json",
                "Content-Type": "application/json",
            }
            r = requests.post(
                "http://localhost:5000/html", headers=headers, json={"html": "<p>text</p>"}
            )
            r.raise_for_status()

            # Submit a simple form template to trigger the import
            if not request.form.get("form.submitted", False):
                return self.index()

            portal = api.portal.get()
            alsoProvides(request, IContentimportLayer)

            installer = get_installer(portal)
            if not installer.is_product_installed("contentimport"):
                installer.install_product("contentimport")

            # install required add-ons
            for addon in DEFAULT_ADDONS:
                if not installer.is_product_installed(addon):
                    installer.install_product(addon)

            # Fake the target being a classic site even though plone.volto is installed...
            # 1. Allow Folders and Collections (they are disabled in Volto by default)
            portal_types = api.portal.get_tool("portal_types")
            portal_types["Collection"].global_allow = True
            portal_types["Folder"].global_allow = True
            # 2. Enable richtext behavior (otherwise no text will be imported)
            for type_ in ["Document", "News Item", "Event"]:
                add_behavior(type_, "plone.richtext")

            transaction.commit()
            cfg = getConfiguration()
            directory = Path(cfg.clienthome) / "import"

            # Import content
            view = api.content.get_view("import_content", portal, request)
            request.form["form.submitted"] = True
            request.form["commit"] = 500
            view(server_file="Plone.json", return_json=True)
            transaction.commit()

            # Run all other imports
            other_imports = [
                "relations",
                "members",
                "translations",
                "localroles",
                "ordering",
                "defaultpages",
                "discussion",
                "portlets",  # not really useful in Volto
                "redirects",
            ]
            for name in other_imports:
                view = api.content.get_view(f"import_{name}", portal, request)
                path = Path(directory) / f"export_{name}.json"
                if path.exists():
                    results = view(jsonfile=path.read_text(), return_json=True)
                    logger.info(results)
                    transaction.get().note(f"Finished import_{name}")
                    transaction.commit()
                else:
                    logger.info(f"Missing file: {path}")

            # Optional: Run html-fixers on richtext
            fixers = [anchor_fixer]
            results = fix_html_in_content_fields(fixers=fixers)
            msg = "Fixed html for {} content items".format(results)
            logger.info(msg)
            transaction.get().note(msg)
            transaction.commit()

            results = fix_html_in_portlets()
            msg = "Fixed html for {} portlets".format(results)
            logger.info(msg)
            transaction.get().note(msg)
            transaction.commit()

            view = api.content.get_view("updateLinkIntegrityInformation", portal, request)
            results = view.update()
            msg = f"Updated linkintegrity for {results} items"
            logger.info(msg)
            transaction.get().note(msg)
            transaction.commit()

            # Rebuilding the catalog is necessary to prevent issues later on
            catalog = api.portal.get_tool("portal_catalog")
            logger.info("Rebuilding catalog...")
            catalog.clearFindAndRebuild()
            msg = "Finished rebuilding catalog!"
            logger.info(msg)
            transaction.get().note(msg)
            transaction.commit()

            # This uses the blocks-conversion-tool to migrate to blocks
            logger.info("Start migrating richtext to blocks...")
            migrate_richtext_to_blocks()
            msg = "Finished migrating richtext to blocks"
            transaction.get().note(msg)
            transaction.commit()

            # Reuse the migration-form from plone.volto to do some more tasks
            view = api.content.get_view("migrate_to_volto", portal, request)
            # Yes, we want to migrate default pages
            view.migrate_default_pages = True
            view.slate = True
            logger.info("Start migrating Folders to Documents...")
            view.do_migrate_folders()
            msg = "Finished migrating Folders to Documents!"
            transaction.get().note(msg)
            transaction.commit()

            logger.info("Start migrating Collections to Documents...")
            view.migrate_collections()
            msg = "Finished migrating Collections to Documents!"
            transaction.get().note(msg)
            transaction.commit()

            reset_dates = api.content.get_view("reset_dates", portal, request)
            reset_dates()
            transaction.commit()

            # Disallow folders and collections again
            portal_types["Collection"].global_allow = False
            portal_types["Folder"].global_allow = False

            # Disable richtext behavior again
            for type_ in ["Document", "News Item", "Event"]:
                remove_behavior(type_, "plone.richtext")

            return request.response.redirect(portal.absolute_url())


    def anchor_fixer(text, obj=None):
        """Remove anchors since they are not supported by Volto yet"""
        soup = BeautifulSoup(text, "html.parser")
        for link in soup.find_all("a"):
            if not link.get("href") and not link.text:
                # drop empty links (e.g. anchors)
                link.decompose()
            elif not link.get("href") and link.text:
                # drop links without a href but keep the text
                link.unwrap()
        return soup.decode()


Migrate very old Plone Versions with data created by collective.jsonify
-----------------------------------------------------------------------

Versions older than Plone 4 do not support ``plone.restapi``, which ``collective.exportimport`` requires to serialize the content.

To migrate Plone 1, 2 and 3 to Plone 6 you can use ``collective.jsonify`` for the export and ``collective.exportimport`` for the import.


Export with collective.jsonify
******************************

Use https://github.com/collective/collective.jsonify to export content.

You include the methods of ``collective.jsonify`` using External Methods.
See https://github.com/collective/collective.jsonify/blob/master/docs/install.rst for more info.

To work better with ``collective.exportimport`` you could extend the exported data using the feature ``additional_wrappers``.
Add info on the parent of an item to make it easier for ``collective.exportimport`` to import the data.

For example, the following ``extend_item`` method adds information about the parent of each item:

.. code-block:: python

    def extend_item(obj, item):
        """Extend to work better well with collective.exportimport"""
        from Acquisition import aq_parent
        parent = aq_parent(obj)
        item["parent"] = {
            "@id": parent.absolute_url(),
            "@type": getattr(parent, "portal_type", None),
        }
        if getattr(parent.aq_base, "UID", None) is not None:
            item["parent"]["UID"] = parent.UID()

        return item


Here is a full example for ``json_methods.py`` which should be in ``<BUILDOUT_ROOT>/parts/instance/Extensions/``

.. code-block:: python

    from collective.jsonify.export import export_content as export_content_orig
    from collective.jsonify.export import get_item

    EXPORTED_TYPES = [
        "Folder",
        "Document",
        "News Item",
        "Event",
        "Link",
        "Topic",
        "File",
        "Image",
        "RichTopic",
    ]

    EXTRA_SKIP_PATHS = [
        "/Plone/archiv/",
        "/Plone/do-not-import/",
    ]

    # Path from which to continue the export.
    # The export walks the whole site respecting the order.
    # It will ignore everything until this path is reached.
    PREVIOUS = ""

    def export_content(self):
        return export_content_orig(
            self,
            basedir="/var/lib/zope/json",
            skip_callback=skip_item,
            extra_skip_classname=[],
            extra_skip_id=[],
            extra_skip_paths=EXTRA_SKIP_PATHS,
            batch_start=0,
            batch_size=10000,
            batch_previous_path=PREVIOUS or None,
        )

    def skip_item(item):
        """Return True if the item should be skipped"""
        portal_type = getattr(item, "portal_type", None)
        if portal_type not in EXPORTED_TYPES:
            return True

    def extend_item(obj, item):
        """Extend to work better well with collective.exportimport"""
        from Acquisition import aq_parent
        parent = aq_parent(obj)
        item["parent"] = {
            "@id": parent.absolute_url(),
            "@type": getattr(parent, "portal_type", None),
        }
        if getattr(parent.aq_base, "UID", None) is not None:
            item["parent"]["UID"] = parent.UID()

        return item

To use these, create three "External Method" objects at the Zope root in the ZMI:

* id: "export_content", module name: "json_methods", function name: "export_content"
* id: "get_item", module name: "json_methods", function name: "get_item"
* id: "extend_item", module name: "json_methods", function name: "extend_item"

Then you can pass the extender to the export using a query-string: http://localhost:8080/Plone/export_content?additional_wrappers=extend_item
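
Instead of opening that URL in the browser, you could also trigger the export from a small script.
A minimal sketch using ``requests`` (the port and the credentials are assumptions, adjust them to your setup):

.. code-block:: python

    import requests

    # Trigger the collective.jsonify export with the extend_item extender.
    # The export itself runs inside Zope and writes the json-files to the
    # basedir configured in json_methods.py.
    response = requests.get(
        "http://localhost:8080/Plone/export_content",
        params={"additional_wrappers": "extend_item"},
        auth=("admin", "admin"),  # assumption: a Zope user with enough permissions
        timeout=None,  # exporting a large site can take a long time
    )
    response.raise_for_status()
    print(response.text)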


Import with collective.jsonify
******************************

Two issues need to be dealt with to allow ``collective.exportimport`` to import the data generated by ``collective.jsonify``.

#. The data is in directories instead of in one large json-file.
#. The json is not in the expected format.

Starting with version 1.8 you can pass an iterator to the import.

You need to create a directory-walker that sorts the json-files the right way.
By default it would import them in the order ``1.json``, ``10.json``, ``100.json``, ``101.json`` and so on.

.. code-block:: python

    from logging import getLogger
    from pathlib import Path

    import json

    logger = getLogger(__name__)

    def filesystem_walker(path=None):
        root = Path(path)
        assert(root.is_dir())
        folders = sorted([i for i in root.iterdir() if i.is_dir() and i.name.isdecimal()], key=lambda i: int(i.name))
        for folder in folders:
            json_files = sorted([i for i in folder.glob("*.json") if i.stem.isdecimal()], key=lambda i: int(i.stem))
            for json_file in json_files:
                logger.debug("Importing %s", json_file)
                item = json.loads(json_file.read_text())
                item["json_file"] = str(json_file)
                item = prepare_data(item)
                if item:
                    yield item

The walker expects ``path`` to be a root directory that contains one or more directories holding the json-files.
Both directories and files are sorted numerically using the number in their name.

The method ``prepare_data`` modifies the data before passing it to the import.
A very similar task is done by ``collective.exportimport`` during export.

.. code-block:: python

    _marker = object()  # sentinel used by migrate_field() below

    def prepare_data(item):
        """modify jsonify data to work with c.exportimport"""

        # Drop relationfields or defer the import
        item.pop("relatedItems", None)

        mapping = {
            # jsonify => exportimport
            "_uid": "UID",
            "_type": "@type",
            "_path": "@id",
            "_layout": "layout",
            # AT fieldnames => DX fieldnames
            "excludeFromNav": "exclude_from_nav",
            "allowDiscussion": "allow_discussion",
            "subject": "subjects",
            "expirationDate": "expires",
            "effectiveDate": "effective",
            "creation_date": "created",
            "modification_date": "modified",
            "startDate": "start",
            "endDate": "end",
            "openEnd": "open_end",
            "eventUrl": "event_url",
            "wholeDay": "whole_day",
            "contactEmail": "contact_email",
            "contactName": "contact_name",
            "contactPhone": "contact_phone",
            "imageCaption": "image_caption",
        }
        for old, new in mapping.items():
            item = migrate_field(item, old, new)

        if item.get("constrainTypesMode", None) == 1:
            item = migrate_field(item, "constrainTypesMode", "constrain_types_mode")
        else:
            item.pop("locallyAllowedTypes", None)
            item.pop("immediatelyAddableTypes", None)
            item.pop("constrainTypesMode", None)

        if "id" not in item:
            item["id"] = item["_id"]
        return item


    def migrate_field(item, old, new):
        if item.get(old, _marker) is not _marker:
            item[new] = item.pop(old)
        return item

You can pass the generator ``filesystem_walker`` to the import:

.. code-block:: python

    class ImportAll(BrowserView):

        def __call__(self):
            # ...
            cfg = getConfiguration()
            directory = Path(cfg.clienthome) / "import"

            # import content
            view = api.content.get_view("import_content", portal, request)
            request.form["form.submitted"] = True
            request.form["commit"] = 1000
            view(iterator=filesystem_walker(directory / "mydata"))

            # import default-pages
            import_deferred = api.content.get_view("import_deferred", portal, request)
            import_deferred()


    class ImportDeferred(BrowserView):

        def __call__(self):
            self.title = "Import Deferred Settings (default pages)"
            if not self.request.form.get("form.submitted", False):
                return self.index()

            for brain in api.content.find(portal_type="Folder"):
                obj = brain.getObject()
                annotations = IAnnotations(obj)
                if DEFERRED_KEY not in annotations:
                    continue

                default = annotations[DEFERRED_KEY].pop("_defaultpage", None)
                if default and default in obj:
                    logger.info("Setting %s as default page for %s", default, obj.absolute_url())
                    obj.setDefaultPage(default)
                if not annotations[DEFERRED_KEY]:
                    annotations.pop(DEFERRED_KEY)
            api.portal.show_message("Done", self.request)
            return self.index()

``collective.jsonify`` puts the info on relations, translations and default-pages in the export-file.
You can use this approach of deferring imports to deal with that data after all items have been imported.
The example ``ImportDeferred`` above uses that approach to set the default pages.

This ``global_obj_hook`` below stores that data in an annotation:

.. code-block:: python

    def global_obj_hook(self, obj, item):
        # Store deferred data in an annotation.
        keys = ["_defaultpage"]
        data = {}
        for key in keys:
            if value := item.get(key, None):
                data[key] = value
        if data:
            annotations = IAnnotations(obj)
            annotations[DEFERRED_KEY] = data


Translations
============

This product has been translated into

- Spanish


Contribute
==========

- Issue Tracker: https://github.com/collective/collective.exportimport/issues
- Source Code: https://github.com/collective/collective.exportimport


Support
-------

If you are having issues, please let us know.


License
-------

The project is licensed under the GPLv2.


Written by
==========

.. image:: ./docs/starzel.png
    :target: https://www.starzel.de
    :alt: Starzel.de


Contributors
============

- Philip Bauer, bauer@starzel.de

- Maurits van Rees, m.van.rees@zestsoftware.nl

- Fred van Dijk, f.van.dijk@zestsoftware.nl

- Leonardo J. Caballero G., leonardocaballero@gmail.com


Changelog
=========


1.12 (2024-03-08)
-----------------

- Fix migrating blocks to make Volto sites portable and support plone.distribution.
  [pbauer, tlotze]


1.11 (2024-02-28)
-----------------

- Fix ``AttributeError: 'NamedFile' object has no attribute '_blob'`` when using the setting
  "Include blobs as blob paths" and exporting objects with
  plone.namedfile.file.NamedFile properties (so not blobs).
  [valipod]

- Add more Python 2 compatible version specifications and update the README.
  [thet]

- Fix ``KeyError: time`` when importing content with a workflow that does not have the ``time`` variable.
  [maurits]

- Allow to use fix_html_in_content_fields without applying the default html_fixer.
  [pbauer]

- Try to restore broken blobs when exporting content.
  [thet]

- When exporting into separate JSON files also write the errors in a separate errors.json file.
  This fixes an error at the end of the export which resulted in no errors being written.
  [thet]

- Add support for ATTopic export_content
  [avoinea]

- Add principals to groups that already exist during import (#228)
  [pbauer]

- In export_members ignore transitive membership of groups (#240)
  [pbauer]


1.10 (2023-10-11)
-----------------

- Don't re-use `mapping` variable when migrating portlet data.
  [witsch]

- Fix editing revision author - refs #216
  [avoinea]

- Better support for portal import which avoids parsing JSON twice.
  [gotcha]

- Migrate portlets on site root.
  [ThibautBorn]

- Support export & import to have one separate json-file per content item.
  [pbauer]


1.9 (2023-05-18)
----------------

- Allow passing custom filenames to exports
  [pbauer]

- Support export and import of Plone Site root (using update strategy).
  [pbauer]

- Fix blob export when Connection uses TmpStore
  [gotcha, pbauer]

- Fix portlet richtext field import
  [mpeeters]

- Add portlet location on exported data
  [mpeeters]

- Migrate root of portlets that used a path in plone4 to using a uid (navigation, search, events, collection).
  [pbauer]

- Make export of discussions and portlets contextual
  [mpeeters]

- Fix critical bug when importing groups: Do not import groups that a group belongs to as members of the new group.
  This could have caused groups to have more privileges than they should.
  [pbauer]


1.8 (2023-04-20)
----------------

- Import: run set_uuid method before we call custom hooks, so the hooks have access to
  the item UUID. Fix #185.
- Document COLLECTIVE_EXPORTIMPORT_CENTRAL_DIRECTORY in README.
  [fredvd]

- Add Spanish translation.
  [macagua]

- Add i18n support.
  [macagua]

- Fix html: improve mapping from scale to picture variant.  [maurits]

- Allow overriding the fallback variant in img_variant_fixer.
  Use 'medium' by default.
  [maurits]

- Let fix_html view work on the current context.  [maurits]

- Fix the way we get a blob path. (#180)
  [ale-rt]

- Create documents as containers for items without parent when documents are folderish.
  [JeffersonBledsoe]

- Add support for passing any iterator as data-source to the import.
  [pbauer]

- Add example for importing collective.jsonify data to documentation.
  [pbauer]

- Better serialization of Topics:
  - Use newer criteria added in Plone 5
  - Add fallback for some criteria
  - Export sort_on and sort_reversed
  - Export customView as tabular_view
  [pbauer]

- Always import discussions independent if discussion support is enabled or not
  on a particular content object (#182)
  [ajung]


1.7 (2023-01-20)
----------------

- Filter out 'Discussion Item' in content type export list. Comments have their own export and
  import views. A normal content type export for comments will raise a KeyError when trying to find
  the parent. (#112)
  [fredvd]

- Be more specific in the import_translation endpoint condition to install in a site with p.a.multilingual 1.x
  [erral]

- Fix importing hidden portlets as visible. (#152)
  [pbauer]

- Use ``Language=all`` when querying TranslationGroup items
  [erral]

- Fix members import, by handling members that already exist.
  [sunew]

- Don't use new_id because a hook can change ``item["id"]``
  [pbauer]

- Support exporting the blob-path without having access to the blobs.
  [pbauer]

- Set image-variants in html-fields when running @@fix_html targeting Plone 6.
  [pbauer]


1.6 (2022-10-07)
----------------

- Export and import all group-members (including ldap-users and -groups).
  Previously it only exported users and groups created in Plone.
  [pbauer]

- Support importing content without a UUID (e.g. for importing from an external source).
  The minimal required data is @id, @type, id, and @parent["@id"].
  [pbauer]

- Export only value when serializing vocabulary-based fields instead of token/title.
  [pbauer]

- Improve logging of errors during import.
  [pbauer]

- Add INCLUDE_PATHS to specify which paths only should be imported.
  [pbauer]

- Add import_review_state to allow overriding setting the review_state during import.
  [pbauer]

- Export parent UID and use it to find the container to import.
  [pbauer]

- Move the various export-hooks into update_export_data for readability.
  [pbauer]

- Support export to server by passing ``download_to_server=True`` for all exports (#115).
  [pbauer]

- Add support for adding custom html-fixers to fix_html_in_content_fields.
  [pbauer]


1.5 (2022-04-26)
----------------

- Fix AttributeError for getPhysicalPath when checking parent, issue 123.
  [maurits]

- Export and import redirection tool data.
  [gotcha, Michael Penninck]

- Serialize Products.TALESField fields as raw instead of evaluated expression.
  (useful to export PFG overrides)
  [sauzher]

- Make sure we never change an acquired modification_date or creation_date.
  [pbauer]

- Export and import workflow_history.
  [pbauer]

- Fail gracefully on errors during importing portlets.
  [pbauer]

- Ignore containers where content should be imported to that are non-folderish.
  [pbauer]

- Use catalog instead of ZopeFindAndApply and better logging for export_discussion.
  [pbauer]

- Add converter for long ints (py2 only).
  [pbauer]

- By default do not export linkintegrity relations.
  [pbauer]

- Log detailed exception when exporting content fails.
  [pbauer]

- Add start and finish hooks for export of content.
  [pbauer]

- Rewrite export/import of default pages: Use uuid of default-page instead of id.
  Rewrite getting default_page to fix various issues with translated content.
  [pbauer]

- Add export and import of versions/revisions of content (#105).
  [pbauer]


1.4 (2022-01-07)
----------------

- Fix ``debug`` flag in ``ExportRelations``
  [petschki]

- Deserialize portlet-data using restapi to fix importing RichText.
  [pbauer]

- Fix importing richtext with html-entities. Fixes #99
  [pbauer]

- Preserve links to browser-views by using a custom find_object. Fixes #97
  [pbauer]

- Ignore linkintegrity when importing items with replace-strategy.
  [pbauer]

- Add tests for fix_html.
  [pbauer]


1.3 (2021-12-08)
----------------

- Handle default page of the site root object.
  [fulv]

- Optionally (checkbox) skip existing content on import instead of generating it new with a randomized id.
  [petschki]

- Fix `UnboundLocalError` when calling `import_content` with `return_json` and `server_file`.
  [petschki]

- Add option to make a commit every x items.
  [pbauer]

- Improve logging during import in various cases.
  [pbauer]

- Work around case where api.content.get(path=parent_path) raises NotFound instead of returning None.
  [pbauer]

- Keep value of import_to_current_folder.
  [pbauer]

- Fix html unescape in py3.
  [pbauer]

- Fix serializing ATNewsItem image field content.
  [gotcha]

- Migrate eventUrl to event_url (AT to DX).
  [ThibautBorn]

- Log items that cannot be serialized instead of aborting the export.
  [ThibautBorn]

- Add a item_hook to export_localroles.
  [ThibautBorn]

- Fix handling of checkboxes for skip_existing_content and import_to_current_folder.
  [pbauer]

- Move intermediary commit code into commit_hook method to allow overriding.
  [pbauer]

- Add hook global_obj_hook_before_deserializing to modify the created obj before deserializing the data.
  [pbauer]

- Add support to update and to replace existing content during import (#76)
  [pbauer]

- Reindex permissions after importing local roles.
  [pbauer]

- Add export/import for constrains but import content without checking constrains or permissions (#71).
  [pbauer]


1.2 (2021-10-11)
----------------

- Prevent creating content in a different Plone Site in the same database (#52).
  In general, cleanup parent paths when in development on localhost.
  [maurits]

- Read environment variable ``COLLECTIVE_EXPORTIMPORT_CENTRAL_DIRECTORY`` (#51).
  When set, this is used for storing an export file and getting an import file.
  This is useful for sharing content between multiple Plone Sites on the same server.
  [maurits]

- Unescape html entities and line-breaks when importing comments (#43).
  [pbauer]

- Export and import complete sites or content trees with configurable types, depth and path (#40).
  [pbauer]

- Added option to export blobs as blob paths (#50).
  [pbauer, maurits]

- Fixed creating missing folder structure (#45).
  [maurits]

- Export and import portlets (#39).
  [pbauer]

- Export content and write to file using a generator/yield. This avoids memory ballooning to the size of the exported file (#41).
  [fredvd]


1.1 (2021-08-02)
----------------

- Add option to import file from server.
  [maurits]

- Missing ``</form>`` closing tag in ``export_content.pt``
  [petschki]

- Support disabled acquisition of local roles during export/import of local roles.
  [pbauer]

- Use unrestrictedSearchResults to actually export all content.
  [pbauer]

- Add commit message after importing one type.
  [pbauer]

- Fix getting container for some cases.
  [pbauer]

- Fix use in Plone 4.3 without dexterity, zc.relation or plone.app.contenttypes.
  [pbauer]

- Fix @id of collections and parents of subcollections. Fix #30
  [pbauer]

- Fix use in Plone 4.3 with dexterity but without z3c.relationfield.
  [maurits]

- Add export and import for discussions/comments.
  [pbauer]

- Add option to fix collection queries after import.
  [thomasmassmann]

- Reset Creation Date. Fix #29
  [pbauer]

- Remove custom serializer for relations because of ConfigurationConflictError with restapi.
  Relations are dropped anyway in update_data_for_migration when using the default setting.
  [pbauer]

- Migrate batch size for topics.
  [pbauer]

- Fix issue of reusing the previous container when no container for an item could be found.
  [pbauer]

- Add hook self.finish() to do things after importing one file.
  [pbauer]

- Fix installation with older versions of setuptools (#35)
  [pbauer]

- Fix installation using pip (#36)
  [ericof]

- Do not constrain exportable FTIs to allow export of types such as CalendarXFolder or ATTopic Criteria.
  [pbauer]

- Add hook self.start() to do things before importing one file.
  [pbauer]


1.0 (2021-04-27)
----------------

- Support setting values with ``factory_kwargs`` when creating instances during import.
  This can be used to set values that need to be there during subscribers to IObjectAddedEvent.
  [pbauer]


1.0b1 (2021-03-26)
------------------

- Add option to save export on server.
  [pbauer]

- Fix issues in import_relations and import_ordering.
  [pbauer]

- Use links to other exports in export_content for easier override.
  [pbauer]

- Add support for exporting LinguaPlone translations.
  [pbauer]


1.0a2 (2021-03-11)
------------------

- Simplify package structure and remove all unneeded files
  [pbauer]

- Add export/import for position in parent
  [pbauer]


1.0a1 (2021-03-10)
------------------

- Initial release.
  [pbauer]

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/collective/collective.exportimport",
    "name": "collective.exportimport",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*",
    "maintainer_email": "",
    "keywords": "Python Plone CMS",
    "author": "Philip Bauer (for starzel.de)",
    "author_email": "info@starzel.de",
    "download_url": "https://files.pythonhosted.org/packages/a1/aa/40140443bf647e9ea87ed442cf15c6c10533bd96d92786fd2b9422897288/collective.exportimport-1.12.tar.gz",
    "platform": null,
    "description": ".. This README is meant for consumption by humans and PyPI. PyPI can render reStructuredText files, so please do not use Sphinx features.\n   If you want to learn more about writing documentation, please check out: https://6.docs.plone.org/contributing/documentation/\n   This text does not appear on PyPI or GitHub. It is a comment.\n\n.. image:: https://img.shields.io/pypi/v/collective.exportimport.svg\n    :target: https://pypi.org/project/collective.exportimport/\n    :alt: Latest Version\n\n.. image:: https://img.shields.io/pypi/status/collective.exportimport.svg\n    :target: https://pypi.org/project/collective.exportimport/\n    :alt: Egg Status\n\n.. image:: https://img.shields.io/pypi/pyversions/collective.exportimport.svg?style=plastic   :alt: Supported - Python Versions\n\n.. image:: https://img.shields.io/pypi/l/collective.exportimport.svg\n    :target: https://pypi.org/project/collective.exportimport/\n    :alt: License\n\n\n=======================\ncollective.exportimport\n=======================\n\nExport and import content, members, relations, translations, localroles and much more.\n\nExport and import all kinds of data from and to Plone sites using a intermediate json-format.\nThe main use-case is migrations since it enables you to for example migrate from Plone 4 with Archetypes and Python 2 to Plone 6 with Dexterity and Python 3 in one step.\nMost features use `plone.restapi` to serialize and deserialize data.\n\nSee also the training on migrating with ``exportimport``: https://training.plone.org/migrations/exportimport.html\n\n.. contents:: Contents\n    :local:\n\nFeatures\n========\n\n* Export & Import content\n* Export & Import members and groups with their roles\n* Export & Import relations\n* Export & Import translations\n* Export & Import local roles\n* Export & Import order (position in parent)\n* Export & Import discussions/comments\n* Export & Import versioned content\n* Export & Import redirects\n\nExport supports:\n\n* Plone 4, 5 and 6\n* Archetypes and Dexterity\n* Python 2 and 3\n* plone.app.multilingual, Products.LinguaPlone, raptus.multilanguagefields\n\nImport supports:\n\n* Plone 5.2+, Dexterity, Python 2 and 3, plone.app.multilingual\n\n\nInstallation\n============\n\nInstall collective.exportimport as you would install any other Python package.\n\nYou don't need to activate the add-on in the Site Setup Add-ons control panel to be able to use the forms ``@@export_content`` and ``@@import_content`` in your site.\n\nIf you need help, see:\n- for Plone 4: https://4.docs.plone.org/adapt-and-extend/install_add_ons.html\n- for Plone 5: https://5.docs.plone.org/manage/installing/installing_addons.html\n- for Plone 6: https://6.docs.plone.org/install/manage-add-ons-packages.html\n\n\nPython 2 compatibility\n----------------------\n\nThis package is compatible with Python 3 and Python 2.\nDepending on the Python version different versions of it's dependencies will be installed.\nIf you run into problems, file an issue at: https://github.com/collective/collective.exportimport/issues\n\n\nUsage\n=====\n\nExport\n------\n\nUse the form with the URL ``/@@export_content``, and select what you want to export:\n\n.. image:: ./docs/export.png\n\nYou can export one or more types and a whole site or only a specific path in a site. Since items are exported ordered by path importing them will create the same structure as you had originally.\n\nThe downloaded json-file will have the name of the path you exported from, e.g. 
``Plone.json``.\n\nThe exports for members, relations, localroles and relations are linked to in this form but can also be called individually: ``/@@export_members``, ``/@@export_relations``, ``/@@export_localroles``, ``/@@export_translations``, ``/@@export_ordering``, ``/@@export_discussion``.\n\n\nImport\n------\n\nUse the form with the URL ``/@@import_content``, and upload a json-file that you want to import:\n\n.. image:: ./docs/import.png\n\n\nThe imports for members, relations, localroles and relations are linked to in this form but can also be called individually: ``/@@import_members``, ``/@@import_relations``, ``/@@import_localroles``, ``/@@import_translations``, ``/@@import_ordering``, ``/@@import_discussion``.\n\nAs a last step in a migration there is another view ``@@reset_dates`` that resets the modified date on imported content to the date initially contained in the imported json-file. This is necessary since varous changes during a migration will likely result in a updated modified-date. During import the original is stored as ``obj.modification_date_migrated`` on each new object and this view sets this date.\n\nExport- and import locations\n----------------------------\n\nIf you select 'Save to file on server', the Export view will save json files in the <var> directory of your Plone instanc in /var/instance.\nThe import view will look for  files under /var/instance/import.\nThese directories will normally be different, under different Plone instances and possibly on different servers.\n\nYou can set the environment variable 'COLLECTIVE_EXPORTIMPORT_CENTRAL_DIRECTORY' to add a 'shared' directory on one server or maybe network share.\nWith this variable set, collective.exportimport will both save to and load .json files from the same server directory.\nThis saves time not having to move .json files around from the export- to the import location.\nYou should be aware that the Export views will overwrite any existing previous .json file export that have the same name.\n\n\nUse-cases\n=========\n\nMigrations\n----------\n\nWhen a in-place-migration is not required you can choose this add-on to migrate the most important parts of your site to json and then import it into a new Plone instance of your targeted version:\n\n* Export content from a Plone site (it supports Plone 4 and 5, Archetypes and Dexterity, Python 2 and 3).\n* Import the exported content into a new site (Plone 5.2+, Dexterity, Python 3)\n* Export and import relations, users and groups with their roles, translations, local roles, ordering, default-pages, comments, portlets and redirects.\n\nHow to migrate additional features like Annotations or Marker Interfaces is discussed in the FAQ section.\n\nOther\n-----\n\nYou can use this add-on to\n\n* Archive your content as JSON.\n* Export data to prepare a migration to another system.\n* Combine content from multiple plone-sites into one.\n* Import a plone-site as a subsite into another.\n* Import content from other systems as long as it fits the required format.\n* Update or replace existing data.\n\nDetails\n=======\n\nExport content\n--------------\n\nExporting content is basically a wrapper for the serializers of plone.restapi:\n\n.. 
code-block:: python\n\n    from plone.restapi.interfaces import ISerializeToJson\n    from zope.component import getMultiAdapter\n\n    serializer = getMultiAdapter((obj, request), ISerializeToJson)\n    data = serializer(include_items=False)\n\nImport content\n--------------\n\nImporting content is a elaborate wrapper for the deserializers of plone.restapi:\n\n.. code-block:: python\n\n    from plone.restapi.interfaces import IDeserializeFromJson\n    from zope.component import getMultiAdapter\n\n    container.invokeFactory(item['@type'], item['id'])\n    deserializer = getMultiAdapter((new, self.request), IDeserializeFromJson)\n    new = deserializer(validate_all=False, data=item)\n\n\nUse for migrations\n------------------\n\nA main use-case of this package is migration from one Plone-Version to another.\n\nExporting Archetypes content and importing that as Dexterity content works fine but due to changes in field-names some settings would get lost.\nFor example the setting to exclude content from the navigation was renamed from ``excludeFromNav`` to ``exclude_from_nav``.\n\nTo fix this you can check the checkbox \"Modify exported data for migrations\".\nThis will modify the data during export:\n\n* Drop unused data (e.g. `next_item` and `components`)\n* Remove all relation fields\n* Change some field names that changed between Archetypes and Dexterity\n\n  * ``excludeFromNav`` \u2192 ``exclude_from_nav``\n  * ``allowDiscussion`` \u2192 ``allow_discussion``\n  * ``subject`` \u2192 ``subjects``\n  * ``expirationDate`` \u2192 ``expires``\n  * ``effectiveDate`` \u2192 ``effective``\n  * ``creation_date`` \u2192 ``created``\n  * ``modification_date`` \u2192 ``modified``\n  * ``startDate`` \u2192 ``start``\n  * ``endDate`` \u2192 ``end``\n  * ``openEnd`` \u2192 ``open_end``\n  * ``wholeDay`` \u2192 ``whole_day``\n  * ``contactEmail`` \u2192 ``contact_email``\n  * ``contactName`` \u2192 ``contact_name``\n  * ``contactPhone`` \u2192 ``contact_phone``\n\n* Update view names on Folders and Collection that changed since Plone 4.\n* Export ``ATTopic`` and their criteria to Collections with querystrings.\n* Update Collection-criteria.\n* Links and images in Richtext-Fields of content and portlets have changes since Plone 4.\n  the view ``/@@fix_html`` allows you to fix these.\n\n\nControl creating imported content\n---------------------------------\n\nYou can choose between four options how to deal with content that already exists:\n\n  * Skip: Don't import at all\n  * Replace: Delete item and create new\n  * Update: Reuse and only overwrite imported data\n  * Ignore: Create with a new id\n\nImported content is initially created with ``invokeFactory`` using portal_type and id of the exported item before deserializing the rest of the data.\nYou can set additional values by specifying a dict ``factory_kwargs`` that will be passed to the factory.\nLike this you can set values on the imported object that are expected to be there by subscribers to IObjectAddedEvent.\n\n\nExport versioned content\n------------------------\n\nExporting versions of Archetypes content will not work because of a bug in plone.restapi (https://github.com/plone/plone.restapi/issues/1335).\nFor export to work you need to use a version between 7.7.0 and 8.0.0 (if released) or a source-checkout of the branch 7.x.x.\n\n\nNotes on speed and large migrations\n===================================\n\nExporting and importing large amounts of content can take a while. 
Export is pretty fast but import is constrained by some features of Plone, most importantly versioning:\n\n* Importing 5000 Folders takes ~5 minutes\n* Importing 5000 Documents takes >25 minutes because of versioning.\n* Importing 5000 Documents without versioning takes ~7 minutes.\n\nDuring import you can commit every x number of items which will free up memory and disk-space in your TMPDIR (where blobs are added before each commit).\n\nWhen exporting large numbers of blobs (binary files and images) you will get huge json-files and may run out of memory.\nYou have various options to deal with this.\nThe best way depends on how you are going to import the blobs:\n\n- Export as download urls: small download, but ``collective.exportimport`` cannot import the blobs, so you will need an own import script to download them.\n- Export as base-64 encoded strings: large download, but ``collective.exportimport`` can handle the import.\n- Export as blob paths: small download and ``collective.exportimport`` can handle the import, but you need to copy ``var/blobstorage`` to the Plone Site where you do the import or set the environment variable ``COLLECTIVE_EXPORTIMPORT_BLOB_HOME`` to the old blobstorage path: ``export COLLECTIVE_EXPORTIMPORT_BLOB_HOME=/path-to-old-instance/var/blobstorage``.\n  To export the blob-path you do not need to have access to the blobs!\n\n\nFormat of export and import of content\n======================================\n\nBy default all content is exported to and imported from one large json-file.\nTo inspect such very large json-files without performance-issues you can use klogg (https://klogg.filimonov.dev).\n\nSince version 1.10 collective.exportimport also supports exporting and importing each content item as a separate json-file.\nTo use that select *Save each item as a separate file on the server* in the form or specify ``download_to_server=2`` when calling the export in python.\nIn the import-form you can manually select a directory on the server or specify ``server_directory=\"/mydir\"`` when calling the import in python.\n\n\nCustomize export and import\n===========================\n\nThis add-on is designed to be adapted to your requirements and has multiple hooks to make that easy.\n\nTo make that easier here are packages you can reuse to override and extend the export and import.\nUse these templates and adapt them to your own projects:\n\n* https://github.com/starzel/contentexport\n* https://github.com/starzel/contentimport\n\nMany examples for customizing the export and import are collected in the chapter \"FAQ, Tips and Tricks\" below.\n\n.. note::\n\n    As a rule of thumb you should make changes to the data during import unless you need access to the original object for the required changes.\n    One reason is that this way the serialized content in the json-file more closely represents the original data.\n    Another reason is that it allows you to fix issues during the process you are currently developing (i.e. without having to redo the export).\n\n\nExport Example\n--------------\n\n.. 
code-block:: python\n\n    from collective.exportimport.export_content import ExportContent\n\n    class CustomExportContent(ExportContent):\n\n        QUERY = {\n            'Document': {'review_state': ['published', 'pending']},\n        }\n\n        DROP_PATHS = [\n            '/Plone/userportal',\n            '/Plone/en/obsolete_content',\n        ]\n\n        DROP_UIDS = [\n            '71e3e0a6f06942fea36536fbed0f6c42',\n        ]\n\n        def update(self):\n            \"\"\"Use this to override stuff before the export starts\n            (e.g. force a specific language in the request).\"\"\"\n\n        def start(self):\n            \"\"\"Hook to do something before export.\"\"\"\n\n        def finish(self):\n            \"\"\"Hook to do something after export.\"\"\"\n\n        def global_obj_hook(self, obj):\n            \"\"\"Inspect the content item before serialisation data.\n            Bad: Changing the content-item is a horrible idea.\n            Good: Return None if you want to skip this particular object.\n            \"\"\"\n            return obj\n\n        def global_dict_hook(self, item, obj):\n            \"\"\"Use this to modify or skip the serialized data.\n            Return None if you want to skip this particular object.\n            \"\"\"\n            return item\n\n        def dict_hook_document(self, item, obj):\n            \"\"\"Use this to modify or skip the serialized data by type.\n            Return the modified dict (item) or None if you want to skip this particular object.\n            \"\"\"\n            return item\n\n\nRegister it with your own browserlayer to override the default.\n\n.. code-block:: text\n\n  <browser:page\n      name=\"export_content\"\n      for=\"zope.interface.Interface\"\n      class=\".custom_export.CustomExportContent\"\n      layer=\"My.Custom.IBrowserlayer\"\n      permission=\"cmf.ManagePortal\"\n      />\n\n\nImport Example\n--------------\n\n.. 
code-block:: python\n\n    from collective.exportimport.import_content import ImportContent\n\n    class CustomImportContent(ImportContent):\n\n        CONTAINER = {'Event': '/imported-events'}\n\n        # These fields will be ignored\n        DROP_FIELDS = ['relatedItems']\n\n        # Items with these uid will be ignored\n        DROP_UIDS = ['04d1477583c74552a7fcd81a9085c620']\n\n        # These paths will be ignored\n        DROP_PATHS = ['/Plone/doormat/', '/Plone/import_files/']\n\n        # Default values for some fields\n        DEFAULTS = {'which_price': 'normal'}\n\n        def start(self):\n            \"\"\"Hook to do something before importing one file.\"\"\"\n\n        def finish(self):\n            \"\"\"Hook to do something after importing one file.\"\"\"\n\n        def global_dict_hook(self, item):\n            if isinstance(item.get('description', None), dict):\n                item['description'] = item['description']['data']\n            if isinstance(item.get('rights', None), dict):\n                item['rights'] = item['rights']['data']\n            return item\n\n        def dict_hook_customtype(self, item):\n            # change the type\n            item['@type'] = 'anothertype'\n            # drop a field\n            item.pop('experiences', None)\n            return item\n\n        def handle_file_container(self, item):\n            \"\"\"Use this to specify the container in which to create the item in.\n            Return the container for this particular object.\n            \"\"\"\n            return self.portal['imported_files']\n\nRegister it:\n\n.. code-block:: text\n\n  <browser:page\n      name=\"import_content\"\n      for=\"zope.interface.Interface\"\n      class=\".custom_import.CustomImportContent\"\n      layer=\"My.Custom.IBrowserlayer\"\n      permission=\"cmf.ManagePortal\"\n      />\n\n\nAutomate export and import\n--------------------------\n\nRun all exports and save all data in ``var/instance/``:\n\n.. code-block:: python\n\n    from plone import api\n    from Products.Five import BrowserView\n\n    class ExportAll(BrowserView):\n\n        def __call__(self):\n            export_content = api.content.get_view(\"export_content\", self.context, self.request)\n            self.request.form[\"form.submitted\"] = True\n            export_content(\n                portal_type=[\"Folder\", \"Document\", \"News Item\", \"File\", \"Image\"],  # only export these\n                include_blobs=2,  # Export files and images as blob paths\n                download_to_server=True)\n\n            other_exports = [\n                \"export_relations\",\n                \"export_members\",\n                \"export_translations\",\n                \"export_localroles\",\n                \"export_ordering\",\n                \"export_defaultpages\",\n                \"export_discussion\",\n                \"export_portlets\",\n                \"export_redirects\",\n            ]\n            for name in other_exports:\n                view = api.content.get_view(name, portal, request)\n                # This saves each export in var/instance/export_xxx.json\n                view(download_to_server=True)\n\n            # Important! Redirect to prevent infinite export loop :)\n            return self.request.response.redirect(self.context.absolute_url())\n\nRun all imports using the data exported in the example above:\n\n.. 
code-block:: python

    from App.config import getConfiguration
    from collective.exportimport.fix_html import fix_html_in_content_fields
    from collective.exportimport.fix_html import fix_html_in_portlets
    from logging import getLogger
    from pathlib import Path
    from plone import api
    from Products.Five import BrowserView

    import transaction

    logger = getLogger(__name__)


    class ImportAll(BrowserView):

        def __call__(self):
            portal = api.portal.get()
            request = self.request

            # Import content
            view = api.content.get_view("import_content", portal, request)
            request.form["form.submitted"] = True
            request.form["commit"] = 500
            view(server_file="Plone.json", return_json=True)
            transaction.commit()

            # Run all other imports
            other_imports = [
                "relations",
                "members",
                "translations",
                "localroles",
                "ordering",
                "defaultpages",
                "discussion",
                "portlets",
                "redirects",
            ]
            cfg = getConfiguration()
            directory = Path(cfg.clienthome) / "import"
            for name in other_imports:
                view = api.content.get_view(f"import_{name}", portal, request)
                path = Path(directory) / f"export_{name}.json"
                results = view(jsonfile=path.read_text(), return_json=True)
                logger.info(results)
                transaction.commit()

            # Run cleanup steps
            results = fix_html_in_content_fields()
            logger.info("Fixed html for %s content items", results)
            transaction.commit()

            results = fix_html_in_portlets()
            logger.info("Fixed html for %s portlets", results)
            transaction.commit()

            reset_dates = api.content.get_view("reset_dates", portal, request)
            reset_dates()
            transaction.commit()

.. note::

    The views ``@@export_all`` and ``@@import_all`` are also contained in the helper packages https://github.com/starzel/contentexport and https://github.com/starzel/contentimport

FAQ, Tips and Tricks
====================

This section covers frequent use-cases and examples for features that are not required for all migrations.

Using global_obj_hook during export
-----------------------------------

Use ``global_obj_hook`` during export to inspect a content item and decide whether to skip it.

.. code-block:: python

    def global_obj_hook(self, obj):
        # Drop subtopics
        if obj.portal_type == "Topic" and obj.__parent__.portal_type == "Topic":
            return

        # Drop files and images from PFG formfolders
        if obj.__parent__.portal_type == "FormFolder":
            return
        return obj


Using dict-hooks during export
------------------------------

Use ``global_dict_hook`` during export to inspect content and modify the serialized json.
You can also use ``dict_hook_<somecontenttype>`` to better structure your code for readability.

Sometimes data that you add in ``global_dict_hook`` during export needs corresponding code in ``global_obj_hook`` during import.

The placeful workflow policy below is a good example of that pattern:


Export/Import placeful workflow policy
--------------------------------------

Export:

.. 
code-block:: python\n\n    def global_dict_hook(self, item, obj):\n        if obj.isPrincipiaFolderish and \".wf_policy_config\" in obj.keys():\n            wf_policy = obj[\".wf_policy_config\"]\n            item[\"exportimport.workflow_policy\"] = {\n                \"workflow_policy_below\": wf_policy.workflow_policy_below,\n                \"workflow_policy_in\": wf_policy.workflow_policy_in,\n            }\n        return item\n\nImport:\n\n.. code-block:: python\n\n    def global_obj_hook(self, obj, item):\n        wf_policy = item.get(\"exportimport.workflow_policy\")\n        if wf_policy:\n            obj.manage_addProduct[\"CMFPlacefulWorkflow\"].manage_addWorkflowPolicyConfig()\n            wf_policy_config = obj[\".wf_policy_config\"]\n            wf_policy_config.setPolicyIn(wf_policy[\"workflow_policy_in\"], update_security=True)\n            wf_policy_config.setPolicyBelow(wf_policy[\"workflow_policy_below\"], update_security=True)\n\n\nUsing dict-hooks during import\n------------------------------\n\nA lot of fixes can be done during import using the ``global_dict_hook`` or ``dict_hook_<contenttype>``.\n\nHere we prevent the expire-date to be before the effective date since that would lead to validation-errors during deserializing:\n\n.. code-block:: python\n\n    def global_dict_hook(self, item):\n        effective = item.get('effective', None)\n        expires = item.get('expires', None)\n        if effective and expires and expires <= effective:\n            item.pop('expires')\n        return item\n\nHere we drop empty lines from the creators:\n\n.. code-block:: python\n\n    def global_dict_hook(self, item):\n        item[\"creators\"] = [i for i in item.get(\"creators\", []) if i]\n        return item\n\nThis example migrates a ``PloneHelpCenter`` to a simple folder/document structure during import.\nThere are a couple more types to handle (as folder or document) but you get the idea, don't you?\n\n.. code-block:: python\n\n    def dict_hook_helpcenter(self, item):\n        item[\"@type\"] = \"Folder\"\n        item[\"layout\"] = \"listing_view\"\n        return item\n\n    def dict_hook_helpcenterglossary(self, item):\n        item[\"@type\"] = \"Folder\"\n        item[\"layout\"] = \"listing_view\"\n        return item\n\n    def dict_hook_helpcenterinstructionalvideo(self, item):\n        item[\"@type\"] = \"File\"\n        if item.get(\"video_file\"):\n            item[\"file\"] = item[\"video_file\"]\n        return item\n\n    def dict_hook_helpcenterlink(self, item):\n        item[\"@type\"] = \"Link\"\n        item[\"remoteUrl\"] = item.get(\"url\", None)\n        return item\n\n    def dict_hook_helpcenterreferencemanualpage(self, item):\n        item[\"@type\"] = \"Document\"\n        return item\n\nIf you change types during import you need to take care of other cases where types are referenced.\\\nExamples are collection-queries (see \"Fixing invalid collection queries\" below) or constrains (see here):\n\n.. 
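
The snippet below references an ``ALLOWED_TYPES`` list that is not defined in the example itself; it stands for the portal types that actually exist in the target site. One way to build it is sketched here (an assumption, not part of the original recipe):

.. code-block:: python

    from plone import api

    # The portal_types that exist in the target site; anything not in this list
    # is dropped from the constraint settings below. Compute it while a Plone
    # site is set up, e.g. in the ``start()`` hook of your custom import view.
    ALLOWED_TYPES = [fti.getId() for fti in api.portal.get_tool("portal_types").listTypeInfo()]

With that in place the constraints can be cleaned up during import:

.. 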
code-block:: python\n\n    PORTAL_TYPE_MAPPING = {\n        \"Topic\": \"Collection\",\n        \"FormFolder\": \"EasyForm\",\n        \"HelpCenter\": \"Folder\",\n    }\n\n    def global_dict_hook(self, item):\n        if item.get(\"exportimport.constrains\"):\n            types_fixed = []\n            for portal_type in item[\"exportimport.constrains\"][\"locally_allowed_types\"]:\n                if portal_type in PORTAL_TYPE_MAPPING:\n                    types_fixed.append(PORTAL_TYPE_MAPPING[portal_type])\n                elif portal_type in ALLOWED_TYPES:\n                    types_fixed.append(portal_type)\n            item[\"exportimport.constrains\"][\"locally_allowed_types\"] = list(set(types_fixed))\n\n            types_fixed = []\n            for portal_type in item[\"exportimport.constrains\"][\"immediately_addable_types\"]:\n                if portal_type in PORTAL_TYPE_MAPPING:\n                    types_fixed.append(PORTAL_TYPE_MAPPING[portal_type])\n                elif portal_type in ALLOWED_TYPES:\n                    types_fixed.append(portal_type)\n            item[\"exportimport.constrains\"][\"immediately_addable_types\"] = list(set(types_fixed))\n        return item\n\n\nChange workflow\n---------------\n\n.. code-block:: python\n\n    REVIEW_STATE_MAPPING = {\n        \"internal\": \"published\",\n        \"internally_published\": \"published\",\n        \"obsolete\": \"private\",\n        \"hidden\": \"private\",\n    }\n\n    def global_dict_hook(self, item):\n        if item.get(\"review_state\") in REVIEW_STATE_MAPPING:\n            item[\"review_state\"] = REVIEW_STATE_MAPPING[item[\"review_state\"]]\n        return item\n\n\nExport/Import Annotations\n-------------------------\n\nSome core-features of Plone (e.g. comments) use annotations to store data.\nThe core features are already covered but your custom code or community add-ons may use annotations as well.\nHere is how you can migrate them.\n\n**Export**: Only export those Annotations that your really need.\n\n.. code-block:: python\n\n    from zope.annotation.interfaces import IAnnotations\n    ANNOTATIONS_TO_EXPORT = [\n        \"syndication_settings\",\n    ]\n    ANNOTATIONS_KEY = 'exportimport.annotations'\n\n    class CustomExportContent(ExportContent):\n\n        def global_dict_hook(self, item, obj):\n            item = self.export_annotations(item, obj)\n            return item\n\n        def export_annotations(self, item, obj):\n            results = {}\n            annotations = IAnnotations(obj)\n            for key in ANNOTATIONS_TO_EXPORT:\n                data = annotations.get(key)\n                if data:\n                    results[key] = IJsonCompatible(data, None)\n            if results:\n                item[ANNOTATIONS_KEY] = results\n            return item\n\n**Import**:\n\n.. code-block:: python\n\n    from zope.annotation.interfaces import IAnnotations\n    ANNOTATIONS_KEY = \"exportimport.annotations\"\n\n    class CustomImportContent(ImportContent):\n\n        def global_obj_hook(self, obj, item):\n            item = self.import_annotations(obj, item)\n            return item\n\n        def import_annotations(self, obj, item):\n            annotations = IAnnotations(obj)\n            for key in item.get(ANNOTATIONS_KEY, []):\n                annotations[key] = item[ANNOTATIONS_KEY][key]\n            return item\n\nSome features also store data in annotations on the portal, e.g. 
`plone.contentrules.localassignments`, `plone.portlets.categoryblackliststatus`, `plone.portlets.contextassignments`, `syndication_settings`.\nDepending on your requirements you may want to export and import those as well.\n\n\nExport/Import Marker Interfaces\n-------------------------------\n\n**Export**: You may only want to export the marker-interfaces you need.\nIt is a good idea to inspect a list of all used marker interfaces in a portal before deciding what to migrate.\n\n.. code-block:: python\n\n    from zope.interface import directlyProvidedBy\n\n    MARKER_INTERFACES_TO_EXPORT = [\n        \"collective.easyslider.interfaces.ISliderPage\",\n        \"plone.app.layout.navigation.interfaces.INavigationRoot\",\n    ]\n    MARKER_INTERFACES_KEY = \"exportimport.marker_interfaces\"\n\n    class CustomExportContent(ExportContent):\n\n        def global_dict_hook(self, item, obj):\n            item = self.export_marker_interfaces(item, obj)\n            return item\n\n        def export_marker_interfaces(self, item, obj):\n            interfaces = [i.__identifier__ for i in directlyProvidedBy(obj)]\n            interfaces = [i for i in interfaces if i in MARKER_INTERFACES_TO_EXPORT]\n            if interfaces:\n                item[MARKER_INTERFACES_KEY] = interfaces\n            return item\n\n**Import**:\n\n.. code-block:: python\n\n    from plone.dexterity.utils import resolveDottedName\n    from zope.interface import alsoProvides\n\n    MARKER_INTERFACES_KEY = \"exportimport.marker_interfaces\"\n\n    class CustomImportContent(ImportContent):\n\n        def global_obj_hook_before_deserializing(self, obj, item):\n            \"\"\"Apply marker interfaces before deserializing.\"\"\"\n            for iface_name in item.pop(MARKER_INTERFACES_KEY, []):\n                try:\n                    iface = resolveDottedName(iface_name)\n                    if not iface.providedBy(obj):\n                        alsoProvides(obj, iface)\n                        logger.info(\"Applied marker interface %s to %s\", iface_name, obj.absolute_url())\n                except ModuleNotFoundError:\n                    pass\n            return obj, item\n\nSkip versioning during import\n-----------------------------\n\nThe event-handlers of versioning can seriously slow down your imports.\nIt is a good idea to skip it before the import:\n\n.. code-block:: python\n\n    VERSIONED_TYPES = [\n        \"Document\",\n        \"News Item\",\n        \"Event\",\n        \"Link\",\n    ]\n\n    def start(self):\n        self.items_without_parent = []\n        portal_types = api.portal.get_tool(\"portal_types\")\n        for portal_type in VERSIONED_TYPES:\n            fti = portal_types.get(portal_type)\n            behaviors = list(fti.behaviors)\n            if 'plone.versioning' in behaviors:\n                logger.info(f\"Disable versioning for {portal_type}\")\n                behaviors.remove('plone.versioning')\n            fti.behaviors = behaviors\n\nRe-enable versioning and create initial versions after all imports and fixes are done, e.g in the view ``@@import_all``.\n\n.. 
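
The re-enable step below also switches ``plone.locking`` back on. If you want to skip locking during the import as well, you can drop both behaviors in ``start()``; this is a variation of the snippet above (it reuses ``VERSIONED_TYPES``, ``api`` and ``logger`` from that snippet, and assumes the standard behavior names):

.. code-block:: python

    from plone import api

    SKIP_BEHAVIORS = ["plone.versioning", "plone.locking"]

    def start(self):
        portal_types = api.portal.get_tool("portal_types")
        for portal_type in VERSIONED_TYPES:
            fti = portal_types.get(portal_type)
            behaviors = [b for b in fti.behaviors if b not in SKIP_BEHAVIORS]
            if len(behaviors) != len(fti.behaviors):
                logger.info("Disable versioning/locking for %s", portal_type)
            fti.behaviors = behaviors

Re-enabling the behaviors and creating initial versions afterwards:

.. 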
code-block:: python\n\n    from Products.CMFEditions.interfaces.IModifier import FileTooLargeToVersionError\n\n    VERSIONED_TYPES = [\n        \"Document\",\n        \"News Item\",\n        \"Event\",\n        \"Link\",\n    ]\n\n    class ImportAll(BrowserView):\n\n        # re-enable versioning\n        portal_types = api.portal.get_tool(\"portal_types\")\n        for portal_type in VERSIONED_TYPES:\n            fti = portal_types.get(portal_type)\n            behaviors = list(fti.behaviors)\n            if \"plone.versioning\" not in behaviors:\n                behaviors.append(\"plone.versioning\")\n                logger.info(f\"Enable versioning for {portal_type}\")\n            if \"plone.locking\" not in behaviors:\n                behaviors.append(\"plone.locking\")\n                logger.info(f\"Enable locking for {portal_type}\")\n            fti.behaviors = behaviors\n        transaction.get().note(\"Re-enabled versioning\")\n        transaction.commit()\n\n        # create initial version for all versioned types\n        logger.info(\"Creating initial versions\")\n        portal_repository = api.portal.get_tool(\"portal_repository\")\n        brains = api.content.find(portal_type=VERSIONED_TYPES)\n        total = len(brains)\n        for index, brain in enumerate(brains):\n            obj = brain.getObject()\n            try:\n                portal_repository.save(obj=obj, comment=\"Imported Version\")\n            except FileTooLargeToVersionError:\n                pass\n            if not index % 1000:\n                msg = f\"Created versions for {index} of {total} items.\"\n                logger.info(msg)\n                transaction.get().note(msg)\n                transaction.commit()\n        msg = \"Created initial versions\"\n        transaction.get().note(msg)\n        transaction.commit()\n\n\nDealing with validation errors\n------------------------------\n\nSometimes you get validation-errors during import because the data cannot be validated.\nThat can happen when options in a field are generated from content in the site.\nIn these cases you cannot be sure that all options already exist in the portal while importing the content.\n\nIt may also happen, when you have validators that rely on content or configuration that does not exist on import.\n\n.. note::\n\n    For relation fields this is not necessary since relations are imported after content anyway!\n\nThere are two ways to handle these issues:\n\n* Use a simple setter bypassing the validation used by the restapi\n* Defer the import until all other imports were run\n\n\nUse a simple setter\n*******************\n\nYou need to specify which content-types and fields you want to handle that way.\n\nIt is put in a key, that the normal import will ignore and set using ``setattr()`` before deserializing the rest of the data.\n\n.. 
code-block:: python

    SIMPLE_SETTER_FIELDS = {
        "ALL": ["some_shared_field"],
        "CollaborationFolder": ["allowedPartnerDocTypes"],
        "DocType": ["automaticTransferTargets"],
        "DPDocument": ["scenarios"],
        "DPEvent": ["Status"],
    }

    class CustomImportContent(ImportContent):

        def global_dict_hook(self, item):
            simple = {}
            for fieldname in SIMPLE_SETTER_FIELDS.get("ALL", []):
                if fieldname in item:
                    value = item.pop(fieldname)
                    if value:
                        simple[fieldname] = value
            for fieldname in SIMPLE_SETTER_FIELDS.get(item["@type"], []):
                if fieldname in item:
                    value = item.pop(fieldname)
                    if value:
                        simple[fieldname] = value
            if simple:
                item["exportimport.simplesetter"] = simple
            return item

        def global_obj_hook_before_deserializing(self, obj, item):
            """Hook to modify the created obj before deserializing the data."""
            # import simplesetter data before the rest
            for fieldname, value in item.get("exportimport.simplesetter", {}).items():
                setattr(obj, fieldname, value)
            return obj, item

.. note::

    Using ``global_obj_hook_before_deserializing`` makes sure that the data is already there when the event-handlers run after the import.

Defer import
************

You can also wait until all content is imported before setting the values on these fields.
Again, you need to decide which fields of which types you want to handle this way.

Here the data is stored in an annotation on the imported object, from which it is read later.
This example also supports setting some data with ``setattr`` without validating it:

.. code-block:: python

    from plone.restapi.interfaces import IDeserializeFromJson
    from zope.annotation.interfaces import IAnnotations
    from zope.component import getMultiAdapter

    DEFERRED_KEY = "exportimport.deferred"
    DEFERRED_FIELD_MAPPING = {
        "talk": ["somefield"],
        "speaker": [
            "custom_field",
            "another_field",
        ]
    }
    SIMPLE_SETTER_FIELDS = {"custom_type": ["another_field"]}

    class CustomImportContent(ImportContent):

        def global_dict_hook(self, item):
            # Move deferred values to a different key so they are not deserialized.
            # This could also be done during export.
            item[DEFERRED_KEY] = {}
            for fieldname in DEFERRED_FIELD_MAPPING.get(item["@type"], []):
                if item.get(fieldname):
                    item[DEFERRED_KEY][fieldname] = item.pop(fieldname)
            return item

        def global_obj_hook(self, obj, item):
            # Store deferred data in an annotation.
            deferred = item.get(DEFERRED_KEY, {})
            if deferred:
                annotations = IAnnotations(obj)
                annotations[DEFERRED_KEY] = {}
                for key, value in deferred.items():
                    annotations[DEFERRED_KEY][key] = value

You then need a new step in the migration to move the deferred values from the annotation to the field:

.. 
code-block:: python

    class ImportDeferred(BrowserView):

        def __call__(self):
            # This example reuses the form export_other.pt from collective.exportimport
            self.title = "Import deferred data"
            if not self.request.form.get("form.submitted", False):
                return self.index()
            self.results = []
            for brain in api.content.find(portal_type=list(DEFERRED_FIELD_MAPPING)):
                obj = brain.getObject()
                self.import_deferred(obj)
            api.portal.show_message(f"Imported deferred data for {len(self.results)} items!", self.request)

        def import_deferred(self, obj):
            annotations = IAnnotations(obj, {})
            deferred = annotations.get(DEFERRED_KEY, None)
            if not deferred:
                return
            # Shortcut for simple fields (e.g. storing strings, uuids etc.)
            for fieldname in SIMPLE_SETTER_FIELDS.get(obj.portal_type, []):
                value = deferred.pop(fieldname, None)
                if value:
                    setattr(obj, fieldname, value)
            if not deferred:
                return
            # This approach validates the values and converts more complex data
            deserializer = getMultiAdapter((obj, self.request), IDeserializeFromJson)
            try:
                obj = deserializer(validate_all=False, data=deferred)
            except Exception:
                logger.info("Error while importing deferred data for %s", obj.absolute_url(), exc_info=True)
                logger.info("Data: %s", deferred)
            else:
                self.results.append(obj.absolute_url())
            # cleanup
            del annotations[DEFERRED_KEY]

This additional view obviously needs to be registered:

.. code-block:: text

    <browser:page
        name="import_deferred"
        for="zope.interface.Interface"
        class=".import_content.ImportDeferred"
        template="export_other.pt"
        permission="cmf.ManagePortal"
        />


Handle LinguaPlone content
--------------------------

Export:

.. 
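
The export hook below references a few names that are not part of the snippet itself (``IPloneSiteRoot``, ``IUUID`` and a ``logger``). Assuming the hook lives in your custom export module, the imports look roughly like this:

.. code-block:: python

    from logging import getLogger
    from plone.uuid.interfaces import IUUID
    from Products.CMFPlone.interfaces import IPloneSiteRoot

    logger = getLogger(__name__)

The hook itself:

.. 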
code-block:: python

    def global_dict_hook(self, item, obj):
        # Find the language of the nearest parent that has a language.
        # Useful for LinguaPlone sites where some content is language-independent.
        parent = obj.__parent__
        for ancestor in parent.aq_chain:
            if IPloneSiteRoot.providedBy(ancestor):
                # keep language for root content
                nearest_ancestor_lang = item["language"]
                break
            if getattr(ancestor, "getLanguage", None) and ancestor.getLanguage():
                nearest_ancestor_lang = ancestor.getLanguage()
                item["parent"]["language"] = nearest_ancestor_lang
                break

        # This forces "wrong" languages to the nearest parent's language
        if "language" in item and item["language"] != nearest_ancestor_lang:
            logger.info(u"Forcing %s (was %s) for %s %s ", nearest_ancestor_lang, item["language"], item["@type"], item["@id"])
            item["language"] = nearest_ancestor_lang

        # set missing language
        if not item.get("language"):
            item["language"] = nearest_ancestor_lang

        # Add info on translations to help find the right container.
        # Usually this is done by export_translations, but when migrating from
        # LinguaPlone to plone.app.multilingual you sometimes want to check the
        # translation info during import.
        if getattr(obj.aq_base, "getTranslations", None) is not None:
            translations = obj.getTranslations()
            if translations:
                item["translation"] = {}
                for lang in translations:
                    uuid = IUUID(translations[lang][0], None)
                    if uuid == item["UID"]:
                        continue
                    translation = translations[lang][0]
                    if not lang:
                        lang = "no_language"
                    item["translation"][lang] = translation.absolute_url()

Import:

.. 
code-block:: python\n\n    def global_dict_hook(self, item):\n\n        # Adapt this to your site\n        languages = [\"en\", \"fr\", \"de\"]\n        default_language = \"en\"\n        portal_id = \"Plone\"\n\n        # No language => lang of parent or default\n        if item.get(\"language\") not in languages:\n            if item[\"parent\"].get(\"language\"):\n                item[\"language\"] = item[\"parent\"][\"language\"]\n            else:\n                item[\"language\"] = default_language\n\n        lang = item[\"language\"]\n\n        if item[\"parent\"].get(\"language\") != item[\"language\"]:\n            logger.debug(f\"Inconsistent lang: item is {lang}, parent is {item['parent'].get('language')} for {item['@id']}\")\n\n        # Move item to the correct language-root-folder\n        # This is only relevant for items in the site-root.\n        # Most items containers are usually looked up by the uuid of the old parent\n        url = item[\"@id\"]\n        parent_url = item[\"parent\"][\"@id\"]\n\n        url = url.replace(f\"/{portal_id}/\", f\"/{portal_id}/{lang}/\", 1)\n        parent_url = parent_url.replace(f\"/{portal_id}\", f\"/{portal_id}/{lang}\", 1)\n\n        item[\"@id\"] = url\n        item[\"parent\"][\"@id\"] = parent_url\n\n        return item\n\nAlternative ways to handle items without parent\n-----------------------------------------------\n\nOften it is better to export and log items for which no container could be found instead of re-creating the original structure.\n\n.. code-block:: python\n\n    def update(self):\n        self.items_without_parent = []\n\n    def create_container(self, item):\n        # Override create_container to never create parents\n        self.items_without_parent.append(item)\n\n    def finish(self):\n        # export content without parents\n        if self.items_without_parent:\n            data = json.dumps(self.items_without_parent, sort_keys=True, indent=4)\n            number = len(self.items_without_parent)\n            cfg = getConfiguration()\n            filename = 'content_without_parent.json'\n            filepath = os.path.join(cfg.clienthome, filename)\n            with open(filepath, 'w') as f:\n                f.write(data)\n            msg = u\"Saved {} items without parent to {}\".format(number, filepath)\n            logger.info(msg)\n            api.portal.show_message(msg, self.request)\n\n\nExport/Import Zope Users\n------------------------\n\nBy default only users and groups stores in Plone are exported/imported.\nYou can export/import Zope user like this.\n\n**Export**\n\n.. 
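
The export view below also calls ``json_compatible``, which the snippet does not import; it comes from ``plone.restapi`` (the same import is used in the settings export further down):

.. code-block:: python

    # Additional import needed by the export view below:
    from plone.restapi.serializer.converters import json_compatible

The export view:

.. 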
code-block:: python\n\n    from collective.exportimport.export_other import BaseExport\n    from plone import api\n\n    import six\n\n    class ExportZopeUsers(BaseExport):\n\n        AUTO_ROLES = [\"Authenticated\"]\n\n        def __call__(self, download_to_server=False):\n            self.title = \"Export Zope users\"\n            self.download_to_server = download_to_server\n            portal = api.portal.get()\n            app = portal.__parent__\n            self.acl = app.acl_users\n            self.pms = api.portal.get_tool(\"portal_membership\")\n            data = self.all_zope_users()\n            self.download(data)\n\n        def all_zope_users(self):\n            results = []\n            for user in self.acl.searchUsers():\n                data = self._getUserData(user[\"userid\"])\n                data['title'] = user['title']\n                results.append(data)\n            return results\n\n        def _getUserData(self, userId):\n            member = self.pms.getMemberById(userId)\n            roles = [\n                role\n                for role in member.getRoles()\n                if role not in self.AUTO_ROLES\n            ]\n            # userid, password, roles\n            props = {\n                \"username\": userId,\n                \"password\": json_compatible(self._getUserPassword(userId)),\n                \"roles\": json_compatible(roles),\n            }\n            return props\n\n        def _getUserPassword(self, userId):\n            users = self.acl.users\n            passwords = users._user_passwords\n            password = passwords.get(userId, \"\")\n            return password\n\n**Import**:\n\n.. code-block:: python\n\n    class ImportZopeUsers(BrowserView):\n\n        def __call__(self, jsonfile=None, return_json=False):\n            if jsonfile:\n                self.portal = api.portal.get()\n                status = \"success\"\n                try:\n                    if isinstance(jsonfile, str):\n                        return_json = True\n                        data = json.loads(jsonfile)\n                    elif isinstance(jsonfile, FileUpload):\n                        data = json.loads(jsonfile.read())\n                    else:\n                        raise (\"Data is neither text nor upload.\")\n                except Exception as e:\n                    status = \"error\"\n                    logger.error(e)\n                    api.portal.show_message(\n                        u\"Failure while uploading: {}\".format(e),\n                        request=self.request,\n                    )\n                else:\n                    members = self.import_members(data)\n                    msg = u\"Imported {} members\".format(members)\n                    api.portal.show_message(msg, self.request)\n                if return_json:\n                    msg = {\"state\": status, \"msg\": msg}\n                    return json.dumps(msg)\n\n            return self.index()\n\n        def import_members(self, data):\n            app = self.portal.__parent__\n            acl = app.acl_users\n            counter = 0\n            for item in data:\n                username = item[\"username\"]\n                password = item.pop(\"password\")\n                roles = item.pop(\"roles\", [])\n                if not username or not password or not roles:\n                    continue\n                title = item.pop(\"title\", None)\n                acl.users.addUser(username, title, password)\n                for role in 
roles:\n                    acl.roles.assignRoleToPrincipal(role, username)\n                counter += 1\n            return counter\n\n\nExport/Import properties, registry-settings and installed add-ons\n-----------------------------------------------------------------\n\nWhen you migrate multiple similar sites that are configured manually it can be useful to export and import configuration that was set by hand.\n\nExport/Import installed settings and add-ons\n********************************************\n\nThis custom export exports and imports some selected settings and add-ons from a Plone 4.3 site.\n\n**Export:**\n\n.. code-block:: python\n\n    from collective.exportimport.export_other import BaseExport\n    from logging import getLogger\n    from plone import api\n    from plone.restapi.serializer.converters import json_compatible\n\n    logger = getLogger(__name__)\n\n\n    class ExportSettings(BaseExport):\n        \"\"\"Export various settings for haiku sites\n        \"\"\"\n\n        def __call__(self, download_to_server=False):\n            self.title = \"Export installed add-ons various settings\"\n            self.download_to_server = download_to_server\n            if not self.request.form.get(\"form.submitted\", False):\n                return self.index()\n\n            data = self.export_settings()\n            self.download(data)\n\n        def export_settings(self):\n            results = {}\n            addons = []\n            qi = api.portal.get_tool(\"portal_quickinstaller\")\n            for product in qi.listInstalledProducts():\n                if product[\"id\"].startswith(\"myproject.\"):\n                    addons.append(product[\"id\"])\n            results[\"addons\"] = addons\n\n            portal = api.portal.get()\n            registry = {}\n            registry[\"plone.email_from_name\"] = portal.getProperty('email_from_name', '')\n            registry[\"plone.email_from_address\"] = portal.getProperty('email_from_address', '')\n            registry[\"plone.smtp_host\"] = getattr(portal.MailHost, 'smtp_host', '')\n            registry[\"plone.smtp_port\"] = int(getattr(portal.MailHost, 'smtp_port', 25))\n            registry[\"plone.smtp_userid\"] = portal.MailHost.get('smtp_user_id')\n            registry[\"plone.smtp_pass\"] = portal.MailHost.get('smtp_pass')\n            registry[\"plone.site_title\"] = portal.title\n\n            portal_properties = api.portal.get_tool(\"portal_properties\")\n            iprops = portal_properties.imaging_properties\n            registry[\"plone.allowed_sizes\"] = iprops.getProperty('allowed_sizes')\n            registry[\"plone.quality\"] = iprops.getProperty('quality')\n            site_props = portal_properties.site_properties\n            if site_props.hasProperty(\"webstats_js\"):\n                registry[\"plone.webstats_js\"] = site_props.webstats_js\n            results[\"registry\"] = json_compatible(registry)\n            return results\n\n\n**Import:**\n\nThe import installs the add-ons and load the settings in the registry.\nSince Plone 5 ``portal_properties`` is no longer used.\n\n.. 
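
After running the import below you can spot-check that the records arrived in the registry, e.g. from a debug session or an upgrade step. A small sketch using ``plone.api``, assuming the record names exported above:

.. code-block:: python

    from plone import api

    # Read back a few of the imported registry records to verify the import.
    for name in ["plone.site_title", "plone.email_from_address", "plone.smtp_host"]:
        print(name, api.portal.get_registry_record(name))

The import view:

.. 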
code-block:: python\n\n    from logging import getLogger\n    from plone import api\n    from plone.registry.interfaces import IRegistry\n    from Products.CMFPlone.utils import get_installer\n    from Products.Five import BrowserView\n    from zope.component import getUtility\n    from ZPublisher.HTTPRequest import FileUpload\n\n    import json\n\n    logger = getLogger(__name__)\n\n    class ImportSettings(BrowserView):\n        \"\"\"Import various settings\"\"\"\n\n        def __call__(self, jsonfile=None, return_json=False):\n            if jsonfile:\n                self.portal = api.portal.get()\n                status = \"success\"\n                try:\n                    if isinstance(jsonfile, str):\n                        return_json = True\n                        data = json.loads(jsonfile)\n                    elif isinstance(jsonfile, FileUpload):\n                        data = json.loads(jsonfile.read())\n                    else:\n                        raise (\"Data is neither text nor upload.\")\n                except Exception as e:\n                    status = \"error\"\n                    logger.error(e)\n                    api.portal.show_message(\n                        \"Failure while uploading: {}\".format(e),\n                        request=self.request,\n                    )\n                else:\n                    self.import_settings(data)\n                    msg = \"Imported addons and settings\"\n                    api.portal.show_message(msg, self.request)\n                if return_json:\n                    msg = {\"state\": status, \"msg\": msg}\n                    return json.dumps(msg)\n\n            return self.index()\n\n        def import_settings(self, data):\n            installer = get_installer(self.context)\n            for addon in data[\"addons\"]:\n                if not installer.is_product_installed(addon) and installer.is_product_installable(addon):\n                    installer.install_product(addon)\n                    logger.info(f\"Installed addon {addon}\")\n            registry = getUtility(IRegistry)\n            for key, value in data[\"registry\"].items():\n                registry[key] = value\n                logger.info(f\"Imported record {key}: {value}\")\n\n\nExport/Import registry settings\n*******************************\n\nThe pull-request https://github.com/collective/collective.exportimport/pull/130 has views ``@@export_registry`` and ``@@import_registry``.\nThese views export and import registry records that do not use the default-setting specified in the schema for that registry record.\nThe export alone could also be usefull to figure out which settings were modified for a site.\n\nThat code will probably not be merged but you can use it in your own projects.\n\nMigrate PloneFormGen to Easyform\n--------------------------------\n\nTo be able to export PFG as easyform you should use the branch ``migration_features_1.x`` of ``collective.easyform`` in your old site.\nEasyform does not need to be installed, we only need the methods ``fields_model`` and ``actions_model``.\n\nExport:\n\n.. 
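
The export hook below relies on a few module-level names that are not shown in the snippet (``json_compatible``, ``time`` and a ``logger``). Assuming it lives in your custom export module, the imports look roughly like this:

.. code-block:: python

    from logging import getLogger
    from plone.restapi.serializer.converters import json_compatible
    from time import time

    logger = getLogger(__name__)

The export hook for ``FormFolder``:

.. 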
code-block:: python\n\n    def dict_hook_formfolder(self, item, obj):\n        item[\"@type\"] = \"EasyForm\"\n        item[\"is_folderish\"] = False\n\n        from collective.easyform.migration.fields import fields_model\n        from collective.easyform.migration.actions import actions_model\n\n        # this does most of the heavy lifting...\n        item[\"fields_model\"] = fields_model(obj)\n        item[\"actions_model\"] = actions_model(obj)\n\n        # handle thankspage\n        pfg_thankspage = obj.get(obj.getThanksPage(), None)\n        if pfg_thankspage:\n            item[\"thankstitle\"] = pfg_thankspage.title\n            item[\"thanksdescription\"] = pfg_thankspage.Description()\n            item[\"showAll\"] = pfg_thankspage.showAll\n            item[\"showFields\"] = pfg_thankspage.showFields\n            item[\"includeEmpties\"] = pfg_thankspage.includeEmpties\n            item[\"thanksPrologue\"] = json_compatible(pfg_thankspage.thanksPrologue.raw)\n            item[\"thanksEpilogue\"] = json_compatible(pfg_thankspage.thanksEpilogue.raw)\n\n        # optional\n        item[\"exportimport._inputStorage\"] = self.export_saved_data(obj)\n\n        # Drop some PFG fields no longer needed\n        obsolete_fields = [\n            \"layout\",\n            \"actionAdapter\",\n            \"checkAuthenticator\",\n            \"constrainTypesMode\",\n            \"location\",\n            \"thanksPage\",\n        ]\n        for key in obsolete_fields:\n            item.pop(key, None)\n\n        # optional: disable tabs for imported forms\n        item[\"form_tabbing\"] = False\n\n        # fix some custom validators\n        replace_mapping = {\n            \"request.form['\": \"request.form['form.widgets.\",\n            \"request.form.get('\": \"request.form.get('form.widgets.\",\n            \"member and member.id or ''\": \"member and member.getProperty('id', '') or ''\",\n        }\n\n        # fix overrides in actions and fields to use form.widgets.xyz instead of xyz\n        for schema in [\"actions_model\", \"fields_model\"]:\n            for old, new in replace_mapping.items():\n                if old in item[schema]:\n                    item[schema] = item[schema].replace(old, new)\n\n            # add your own fields if you have these issues...\n            for fieldname in [\n                \"email\",\n                \"replyto\",\n            ]:\n                if \"request/form/{}\".format(fieldname) in item[schema]:\n                    item[schema] = item[schema].replace(\"request/form/{}\".format(fieldname), \"python: request.form.get('form.widgets.{}')\".format(fieldname))\n\n        return item\n\n    def export_saved_data(self, obj):\n        actions = {}\n        for data_adapter in obj.objectValues(\"FormSaveDataAdapter\"):\n            data_adapter_name = data_adapter.getId()\n            actions[data_adapter_name] = {}\n            cols = data_adapter.getColumnNames()\n            column_count_mismatch = False\n            for idx, row in enumerate(data_adapter.getSavedFormInput()):\n                if len(row) != len(cols):\n                    column_count_mismatch = True\n                    logger.debug(\"Column count mismatch at row %s\", idx)\n                    continue\n                data = {}\n                for key, value in zip(cols, row):\n                    data[key] = json_compatible(value)\n                id_ = int(time() * 1000)\n                while id_ in actions[data_adapter_name]:  # avoid collisions during export\n           
         id_ += 1\n                data[\"id\"] = id_\n                actions[data_adapter_name][id_] = data\n            if column_count_mismatch:\n                logger.info(\n                    \"Number of columns does not match for all rows. Some data were skipped in \"\n                    \"data adapter %s/%s\",\n                    \"/\".join(obj.getPhysicalPath()),\n                    data_adapter_name,\n                )\n        return actions\n\nImport exported ``PloneFormGen`` data into ``Easyform``:\n\n.. code-block:: python\n\n    def obj_hook_easyform(self, obj, item):\n        if not item.get(\"exportimport._inputStorage\"):\n            return\n        from collective.easyform.actions import SavedDataBTree\n        from persistent.mapping import PersistentMapping\n        if not hasattr(obj, '_inputStorage'):\n            obj._inputStorage = PersistentMapping()\n        for name, data in item[\"exportimport._inputStorage\"].items():\n            obj._inputStorage[name] = SavedDataBTree()\n            for key, row in data.items():\n                 obj._inputStorage[name][int(key)] = row\n\n\nExport and import collective.cover content\n------------------------------------------\n\nExport:\n\n.. code-block:: python\n\n    from collective.exportimport.serializer import get_dx_blob_path\n    from plone.app.textfield.value import RichTextValue\n    from plone.namedfile.file import NamedBlobImage\n    from plone.restapi.interfaces import IJsonCompatible\n    from z3c.relationfield import RelationValue\n    from zope.annotation.interfaces import IAnnotations\n\n    def global_dict_hook(self, item, obj):\n        item = self.handle_cover(item, obj)\n        return item\n\n    def handle_cover(self, item, obj):\n        if ICover.providedBy(obj):\n            item['tiles'] = {}\n            annotations = IAnnotations(obj)\n            for tile in obj.get_tiles():\n                annotation_key = 'plone.tiles.data.{}'.format(tile['id'])\n                annotation = annotations.get(annotation_key, None)\n                if annotation is None:\n                    continue\n                tile_data = self.serialize_tile(annotation)\n                tile_data['type'] = tile['type']\n                item['tiles'][tile['id']] = tile_data\n        return item\n\n    def serialize_tile(self, annotation):\n        data = {}\n        for key, value in annotation.items():\n            if isinstance(value, RichTextValue):\n                value = value.raw\n            elif isinstance(value, RelationValue):\n                value = value.to_object.UID()\n            elif isinstance(value, NamedBlobImage):\n                blobfilepath = get_dx_blob_path(value)\n                if not blobfilepath:\n                    continue\n                value = {\n                    \"filename\": value.filename,\n                    \"content-type\": value.contentType,\n                    \"size\": value.getSize(),\n                    \"blob_path\": blobfilepath,\n                }\n            data[key] = IJsonCompatible(value, None)\n        return data\n\nImport:\n\n.. 
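
The export hook below checks ``ICover.providedBy(obj)`` but does not show where ``ICover`` comes from; assuming ``collective.cover`` is installed in the old site, the import would be:

.. code-block:: python

    # Additional import assumed by the export hooks below:
    from collective.cover.interfaces import ICover

The export hooks:

.. 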
code-block:: python\n\n    from collections import defaultdict\n    from collective.exportimport.import_content import get_absolute_blob_path\n    from plone.app.textfield.interfaces import IRichText\n    from plone.app.textfield.interfaces import IRichTextValue\n    from plone.namedfile.file import NamedBlobImage\n    from plone.namedfile.interfaces import INamedBlobImageField\n    from plone.tiles.interfaces import ITileType\n    from zope.annotation.interfaces import IAnnotations\n    from zope.component import getUtilitiesFor\n    from zope.schema import getFieldsInOrder\n\n    COVER_CONTENT = [\n        \"collective.cover.content\",\n    ]\n\n    def global_obj_hook(self, obj, item):\n        if item[\"@type\"] in COVER_CONTENT and \"tiles\" in item:\n            item = self.import_tiles(obj, item)\n\n    def import_tiles(self, obj, item):\n        RICHTEXT_TILES = defaultdict(list)\n        IMAGE_TILES = defaultdict(list)\n        for tile_name, tile_type in getUtilitiesFor(ITileType):\n            for fieldname, field in getFieldsInOrder(tile_type.schema):\n                if IRichText.providedBy(field):\n                    RICHTEXT_TILES[tile_name].append(fieldname)\n                if INamedBlobImageField.providedBy(field):\n                    IMAGE_TILES[tile_name].append(fieldname)\n\n        annotations = IAnnotations(obj)\n        prefix = \"plone.tiles.data.\"\n        for uid, tile in item[\"tiles\"].items():\n            # TODO: Maybe create all tiles that do not need to be defferred?\n            key = prefix + uid\n            tile_name = tile.pop(\"type\", None)\n            # first set raw data\n            annotations[key] = item[\"tiles\"][uid]\n            for fieldname in RICHTEXT_TILES.get(tile_name, []):\n                raw = annotations[key][fieldname]\n                if raw is not None and not IRichTextValue.providedBy(raw):\n                    annotations[key][fieldname] = RichTextValue(raw, \"text/html\", \"text/x-html-safe\")\n            for fieldname in IMAGE_TILES.get(tile_name, []):\n                data = annotations[key][fieldname]\n                if data is not None:\n                    blob_path = data.get(\"blob_path\")\n                    if not blob_path:\n                        continue\n\n                    abs_blob_path = get_absolute_blob_path(obj, blob_path)\n                    if not abs_blob_path:\n                        logger.info(\"Blob path %s for tile %s of %s %s does not exist!\", blob_path, tile, obj.portal_type, obj.absolute_url())\n                        continue\n                    # Determine the class to use: file or image.\n                    filename = data[\"filename\"]\n                    content_type = data[\"content-type\"]\n\n                    # Write the field.\n                    with open(abs_blob_path, \"rb\") as myfile:\n                        blobdata = myfile.read()\n                    image = NamedBlobImage(\n                        data=blobdata,\n                        contentType=content_type,\n                        filename=filename,\n                    )\n                    annotations[key][fieldname] = image\n        return item\n\n\nFixing invalid collection queries\n---------------------------------\n\nSome queries changes between Plone 4 and 5.\nThis fixes the issues.\n\nThe actual migration of topics to collections in ``collective.exportimport.serializer.SerializeTopicToJson`` does not (yet) take care of that.\n\n.. 
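
The import hook below reuses the ``PORTAL_TYPE_MAPPING`` and ``REVIEW_STATE_MAPPING`` dictionaries from the earlier sections (plus a ``logger``); make sure they are available in the module, for example:

.. code-block:: python

    from logging import getLogger

    logger = getLogger(__name__)

    # Reuse the mappings shown earlier, adapted to your site.
    PORTAL_TYPE_MAPPING = {
        "Topic": "Collection",
        "FormFolder": "EasyForm",
        "HelpCenter": "Folder",
    }

    REVIEW_STATE_MAPPING = {
        "internal": "published",
        "internally_published": "published",
        "obsolete": "private",
        "hidden": "private",
    }

The queries can then be cleaned up during import:

.. 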
code-block:: python

    class CustomImportContent(ImportContent):

        def global_dict_hook(self, item):
            if item["@type"] in ["Collection", "Topic"]:
                item = self.fix_query(item)
            return item

        def fix_query(self, item):
            item["@type"] = "Collection"
            query = item.pop("query", [])
            if not query:
                logger.info("Drop item without query: %s", item["@id"])
                return

            fixed_query = []
            indexes_to_fix = [
                "portal_type",
                "review_state",
                "Creator",
                "Subject",
            ]
            operator_mapping = {
                # old -> new
                "plone.app.querystring.operation.selection.is":
                    "plone.app.querystring.operation.selection.any",
                "plone.app.querystring.operation.string.is":
                    "plone.app.querystring.operation.selection.any",
            }

            for crit in query:
                if crit["i"] == "portal_type" and len(crit["v"]) > 30:
                    # Criterion selects practically all types
                    continue

                if crit["o"].endswith("relativePath") and crit["v"] == "..":
                    # relativePath no longer accepts ..
                    crit["v"] = "..::1"

                if crit["i"] in indexes_to_fix:
                    for old_operator, new_operator in operator_mapping.items():
                        if crit["o"] == old_operator:
                            crit["o"] = new_operator

                if crit["i"] == "portal_type":
                    # Some types may have changed their names
                    fixed_types = []
                    for portal_type in crit["v"]:
                        fixed_type = PORTAL_TYPE_MAPPING.get(portal_type, portal_type)
                        fixed_types.append(fixed_type)
                    crit["v"] = list(set(fixed_types))

                if crit["i"] == "review_state":
                    # Review states may have changed their names
                    fixed_states = []
                    for review_state in crit["v"]:
                        fixed_state = REVIEW_STATE_MAPPING.get(review_state, review_state)
                        fixed_states.append(fixed_state)
                    crit["v"] = list(set(fixed_states))

                if crit["o"] == "plone.app.querystring.operation.string.currentUser":
                    crit["v"] = ""

                fixed_query.append(crit)
            item["query"] = fixed_query

            if not item["query"]:
                logger.info("Drop collection without query: %s", item["@id"])
                return
            return item


Migrate to Volto
----------------

You can reuse the migration code provided by ``@@migrate_to_volto`` in ``plone.volto``.
The following example (used for migrating https://plone.org to Volto) can be used to migrate a site from any older version to Plone 6 with Volto.

You need to have the Blocks Conversion Tool (https://github.com/plone/blocks-conversion-tool) running; it takes care of migrating richtext values to Volto blocks.

See https://6.docs.plone.org/backend/upgrading/version-specific-migration/migrate-to-volto.html for more details on the changes the migration to Volto makes.


.. 
code-block:: python\n\n    from App.config import getConfiguration\n    from bs4 import BeautifulSoup\n    from collective.exportimport.fix_html import fix_html_in_content_fields\n    from collective.exportimport.fix_html import fix_html_in_portlets\n    from contentimport.interfaces import IContentimportLayer\n    from logging import getLogger\n    from pathlib import Path\n    from plone import api\n    from plone.volto.browser.migrate_to_volto import migrate_richtext_to_blocks\n    from plone.volto.setuphandlers import add_behavior\n    from plone.volto.setuphandlers import remove_behavior\n    from Products.CMFPlone.utils import get_installer\n    from Products.Five import BrowserView\n    from zope.interface import alsoProvides\n\n    import requests\n    import transaction\n\n    logger = getLogger(__name__)\n\n    DEFAULT_ADDONS = []\n\n\n    class ImportAll(BrowserView):\n\n        def __call__(self):\n\n            request = self.request\n\n            # Check if Blocks-conversion-tool is running\n            headers = {\n                \"Accept\": \"application/json\",\n                \"Content-Type\": \"application/json\",\n            }\n            r = requests.post(\n                \"http://localhost:5000/html\", headers=headers, json={\"html\": \"<p>text</p>\"}\n            )\n            r.raise_for_status()\n\n            # Submit a simple form template to trigger the import\n            if not request.form.get(\"form.submitted\", False):\n                return self.index()\n\n            portal = api.portal.get()\n            alsoProvides(request, IContentimportLayer)\n\n            installer = get_installer(portal)\n            if not installer.is_product_installed(\"contentimport\"):\n                installer.install_product(\"contentimport\")\n\n            # install required add-ons\n            for addon in DEFAULT_ADDONS:\n                if not installer.is_product_installed(addon):\n                    installer.install_product(addon)\n\n            # Fake the target being a classic site even though plone.volto is installed...\n            # 1. Allow Folders and Collections (they are disabled in Volto by default)\n            portal_types = api.portal.get_tool(\"portal_types\")\n            portal_types[\"Collection\"].global_allow = True\n            portal_types[\"Folder\"].global_allow = True\n            # 2. 
Enable richtext behavior (otherwise no text will be imported)\n            for type_ in [\"Document\", \"News Item\", \"Event\"]:\n                add_behavior(type_, \"plone.richtext\")\n\n            transaction.commit()\n            cfg = getConfiguration()\n            directory = Path(cfg.clienthome) / \"import\"\n\n            # Import content\n            view = api.content.get_view(\"import_content\", portal, request)\n            request.form[\"form.submitted\"] = True\n            request.form[\"commit\"] = 500\n            view(server_file=\"Plone.json\", return_json=True)\n            transaction.commit()\n\n            # Run all other imports\n            other_imports = [\n                \"relations\",\n                \"members\",\n                \"translations\",\n                \"localroles\",\n                \"ordering\",\n                \"defaultpages\",\n                \"discussion\",\n                \"portlets\",  # not really useful in Volto\n                \"redirects\",\n            ]\n            for name in other_imports:\n                view = api.content.get_view(f\"import_{name}\", portal, request)\n                path = Path(directory) / f\"export_{name}.json\"\n                if path.exists():\n                    results = view(jsonfile=path.read_text(), return_json=True)\n                    logger.info(results)\n                    transaction.get().note(f\"Finished import_{name}\")\n                    transaction.commit()\n                else:\n                    logger.info(f\"Missing file: {path}\")\n\n            # Optional: Run html-fixers on richtext\n            fixers = [anchor_fixer]\n            results = fix_html_in_content_fields(fixers=fixers)\n            msg = \"Fixed html for {} content items\".format(results)\n            logger.info(msg)\n            transaction.get().note(msg)\n            transaction.commit()\n\n            results = fix_html_in_portlets()\n            msg = \"Fixed html for {} portlets\".format(results)\n            logger.info(msg)\n            transaction.get().note(msg)\n            transaction.commit()\n\n            view = api.content.get_view(\"updateLinkIntegrityInformation\", portal, request)\n            results = view.update()\n            msg = f\"Updated linkintegrity for {results} items\"\n            logger.info(msg)\n            transaction.get().note(msg)\n            transaction.commit()\n\n            # Rebuilding the catalog is necessary to prevent issues later on\n            catalog = api.portal.get_tool(\"portal_catalog\")\n            logger.info(\"Rebuilding catalog...\")\n            catalog.clearFindAndRebuild()\n            msg = \"Finished rebuilding catalog!\"\n            logger.info(msg)\n            transaction.get().note(msg)\n            transaction.commit()\n\n            # This uses the blocks-conversion-tool to migrate to blocks\n            logger.info(\"Start migrating richtext to blocks...\")\n            migrate_richtext_to_blocks()\n            msg = \"Finished migrating richtext to blocks\"\n            transaction.get().note(msg)\n            transaction.commit()\n\n            # Reuse the migration-form from plon.volto to do some more tasks\n            view = api.content.get_view(\"migrate_to_volto\", portal, request)\n            # Yes, wen want to migrate default pages\n            view.migrate_default_pages = True\n            view.slate = True\n            logger.info(\"Start migrating Folders to Documents...\")\n            view.do_migrate_folders()\n     
       msg = \"Finished migrating Folders to Documents!\"\n            transaction.get().note(msg)\n            transaction.commit()\n\n            logger.info(\"Start migrating Collections to Documents...\")\n            view.migrate_collections()\n            msg = \"Finished migrating Collections to Documents!\"\n            transaction.get().note(msg)\n            transaction.commit()\n\n            reset_dates = api.content.get_view(\"reset_dates\", portal, request)\n            reset_dates()\n            transaction.commit()\n\n            # Disallow folders and collections again\n            portal_types[\"Collection\"].global_allow = False\n            portal_types[\"Folder\"].global_allow = False\n\n            # Disable richtext behavior again\n            for type_ in [\"Document\", \"News Item\", \"Event\"]:\n                remove_behavior(type_, \"plone.richtext\")\n\n            return request.response.redirect(portal.absolute_url())\n\n\n    def anchor_fixer(text, obj=None):\n        \"\"\"Remove anchors since they are not supported by Volto yet\"\"\"\n        soup = BeautifulSoup(text, \"html.parser\")\n        for link in soup.find_all(\"a\"):\n            if not link.get(\"href\") and not link.text:\n                # drop empty links (e.g. anchors)\n                link.decompose()\n            elif not link.get(\"href\") and link.text:\n                # drop links without a href but keep the text\n                link.unwrap()\n        return soup.decode()\n\n\nMigrate very old Plone Versions with data created by collective.jsonify\n-----------------------------------------------------------------------\n\nVersions older than Plone 4 do not support ``plone.restapi`` which is required to serialize the content used by ``collective.exportimport``.\n\nTo migrate Plone 1, 2 and 3 to Plone 6 you can use ``collective.jsonify`` for the export and ``collective.exportimport`` for the import.\n\n\nExport with collective.jsonify\n******************************\n\nUse https://github.com/collective/collective.jsonify to export content.\n\nYou include the methods of ``collective.jsonify`` using `External Methods`.\nSee https://github.com/collective/collective.jsonify/blob/master/docs/install.rst for more info.\n\nTo work better with ``collective.exportimport`` you could extend the exported data using the feature ``additional_wrappers``.\nAdd info on the parent of an item to make it easier for ``collective.exportimport`` to import the data.\n\nHere is a full example for ``json_methods.py`` which should be in ``BUILDOUT_ROOT/parts/instance/Extensions/``\n\n.. code-block:: python\n\n    def extend_item(obj, item):\n        \"\"\"Extend to work better well with collective.exportimport\"\"\"\n        from Acquisition import aq_parent\n        parent = aq_parent(obj)\n        item[\"parent\"] = {\n            \"@id\": parent.absolute_url(),\n            \"@type\": getattr(parent, \"portal_type\", None),\n        }\n        if getattr(parent.aq_base, \"UID\", None) is not None:\n            item[\"parent\"][\"UID\"] = parent.UID()\n\n        return item\n\n\nHere is a full example for ``json_methods.py`` which should be in ``<BUILDOUT_ROOT>/parts/instance/Extensions/``\n\n.. 
Migrate very old Plone versions with data created by collective.jsonify
------------------------------------------------------------------------

Versions older than Plone 4 do not support ``plone.restapi``, which ``collective.exportimport`` requires to serialize content.

To migrate Plone 1, 2 or 3 to Plone 6 you can use ``collective.jsonify`` for the export and ``collective.exportimport`` for the import.


Export with collective.jsonify
*******************************

Use https://github.com/collective/collective.jsonify to export content.

You include the methods of ``collective.jsonify`` using `External Methods`.
See https://github.com/collective/collective.jsonify/blob/master/docs/install.rst for more info.

To make the data work better with ``collective.exportimport`` you can extend it using the ``additional_wrappers`` feature.
For example, add information about the parent of an item to make it easier for ``collective.exportimport`` to import the data:

.. code-block:: python

    def extend_item(obj, item):
        """Extend the exported item to work well with collective.exportimport."""
        from Acquisition import aq_parent
        parent = aq_parent(obj)
        item["parent"] = {
            "@id": parent.absolute_url(),
            "@type": getattr(parent, "portal_type", None),
        }
        if getattr(parent.aq_base, "UID", None) is not None:
            item["parent"]["UID"] = parent.UID()

        return item


Here is a full example for ``json_methods.py``, which should be in ``<BUILDOUT_ROOT>/parts/instance/Extensions/``:

.. code-block:: python

    from collective.jsonify.export import export_content as export_content_orig
    from collective.jsonify.export import get_item

    EXPORTED_TYPES = [
        "Folder",
        "Document",
        "News Item",
        "Event",
        "Link",
        "Topic",
        "File",
        "Image",
        "RichTopic",
    ]

    EXTRA_SKIP_PATHS = [
        "/Plone/archiv/",
        "/Plone/do-not-import/",
    ]

    # Path from which to continue the export.
    # The export walks the whole site respecting the order.
    # It will ignore everything until this path is reached.
    PREVIOUS = ""

    def export_content(self):
        return export_content_orig(
            self,
            basedir="/var/lib/zope/json",
            skip_callback=skip_item,
            extra_skip_classname=[],
            extra_skip_id=[],
            extra_skip_paths=EXTRA_SKIP_PATHS,
            batch_start=0,
            batch_size=10000,
            batch_previous_path=PREVIOUS or None,
        )

    def skip_item(item):
        """Return True if the item should be skipped."""
        portal_type = getattr(item, "portal_type", None)
        if portal_type not in EXPORTED_TYPES:
            return True

    def extend_item(obj, item):
        """Extend the exported item to work well with collective.exportimport."""
        from Acquisition import aq_parent
        parent = aq_parent(obj)
        item["parent"] = {
            "@id": parent.absolute_url(),
            "@type": getattr(parent, "portal_type", None),
        }
        if getattr(parent.aq_base, "UID", None) is not None:
            item["parent"]["UID"] = parent.UID()

        return item

To use these, create three "External Method" objects in the ZMI at the Zope root:

* id: "export_content", module name: "json_methods", function name: "export_content"
* id: "get_item", module name: "json_methods", function name: "get_item"
* id: "extend_item", module name: "json_methods", function name: "extend_item"

Then you can pass the extender to the export using a query-string: http://localhost:8080/Plone/export_content?additional_wrappers=extend_item


Import with collective.jsonify
*******************************

Two issues need to be dealt with before ``collective.exportimport`` can import the data generated by ``collective.jsonify``:

#. The data is in directories instead of in one large json-file.
#. The json is not in the expected format.

Starting with version 1.8 you can pass an iterator to the import.

You need to create a directory-walker that sorts the json-files the right way.
By default they would be imported in the order ``1.json``, ``10.json``, ``100.json``, ``101.json`` and so on.
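That ordering is plain lexicographic string sorting at work. A quick, stand-alone illustration (nothing here is specific to the add-on):

.. code-block:: python

    files = ["1.json", "2.json", "10.json", "100.json"]

    # Sorting strings compares character by character:
    sorted(files)
    # -> ['1.json', '10.json', '100.json', '2.json']

    # Sorting by the numeric part restores the intended order:
    sorted(files, key=lambda name: int(name.split(".")[0]))
    # -> ['1.json', '2.json', '10.json', '100.json']

The walker below therefore sorts folders and files numerically.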
.. code-block:: python

    import json
    import logging

    from pathlib import Path

    logger = logging.getLogger(__name__)


    def filesystem_walker(path=None):
        root = Path(path)
        assert root.is_dir()
        folders = sorted([i for i in root.iterdir() if i.is_dir() and i.name.isdecimal()], key=lambda i: int(i.name))
        for folder in folders:
            json_files = sorted([i for i in folder.glob("*.json") if i.stem.isdecimal()], key=lambda i: int(i.stem))
            for json_file in json_files:
                logger.debug("Importing %s", json_file)
                item = json.loads(json_file.read_text())
                item["json_file"] = str(json_file)
                item = prepare_data(item)
                if item:
                    yield item

The walker expects ``path`` to be a root directory containing one or more numbered directories that hold the json-files.
Folders and files are sorted by the number in their name.

The function ``prepare_data`` modifies each item before it is passed to the import.
``collective.exportimport`` does a very similar job during its own export.

.. code-block:: python

    def prepare_data(item):
        """Modify jsonify data to work with collective.exportimport."""

        # Drop relation-fields or defer their import
        item.pop("relatedItems", None)

        mapping = {
            # jsonify => exportimport
            "_uid": "UID",
            "_type": "@type",
            "_path": "@id",
            "_layout": "layout",
            # AT fieldnames => DX fieldnames
            "excludeFromNav": "exclude_from_nav",
            "allowDiscussion": "allow_discussion",
            "subject": "subjects",
            "expirationDate": "expires",
            "effectiveDate": "effective",
            "creation_date": "created",
            "modification_date": "modified",
            "startDate": "start",
            "endDate": "end",
            "openEnd": "open_end",
            "eventUrl": "event_url",
            "wholeDay": "whole_day",
            "contactEmail": "contact_email",
            "contactName": "contact_name",
            "contactPhone": "contact_phone",
            "imageCaption": "image_caption",
        }
        for old, new in mapping.items():
            item = migrate_field(item, old, new)

        if item.get("constrainTypesMode", None) == 1:
            item = migrate_field(item, "constrainTypesMode", "constrain_types_mode")
        else:
            item.pop("locallyAllowedTypes", None)
            item.pop("immediatelyAddableTypes", None)
            item.pop("constrainTypesMode", None)

        if "id" not in item:
            item["id"] = item["_id"]
        return item


    # Sentinel to tell a missing key apart from falsy values
    _marker = object()


    def migrate_field(item, old, new):
        if item.get(old, _marker) is not _marker:
            item[new] = item.pop(old)
        return item
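To make the field mapping concrete, here is a tiny, made-up jsonify record and what ``prepare_data`` (as defined above) turns it into. The values are purely illustrative:

.. code-block:: python

    raw = {
        "_uid": "0123456789abcdef0123456789abcdef",
        "_type": "Document",
        "_path": "/Plone/about/team",
        "_id": "team",
        "excludeFromNav": True,
        "relatedItems": ["<some-uid>"],
    }
    item = prepare_data(dict(raw))
    # item now contains (among other keys) the names expected by the import:
    #   "UID", "@type", "@id", "exclude_from_nav" and "id",
    # while "relatedItems" has been dropped.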
You can pass the generator ``filesystem_walker`` to the import:

.. code-block:: python

    class ImportAll(BrowserView):

        def __call__(self):
            # ...
            cfg = getConfiguration()
            directory = Path(cfg.clienthome) / "import"

            # import content
            view = api.content.get_view("import_content", portal, request)
            request.form["form.submitted"] = True
            request.form["commit"] = 1000
            view(iterator=filesystem_walker(directory / "mydata"))

            # import default-pages
            import_deferred = api.content.get_view("import_deferred", portal, request)
            import_deferred()


    class ImportDeferred(BrowserView):

        def __call__(self):
            self.title = "Import Deferred Settings (default pages)"
            if not self.request.form.get("form.submitted", False):
                return self.index()

            for brain in api.content.find(portal_type="Folder"):
                obj = brain.getObject()
                annotations = IAnnotations(obj)
                if DEFERRED_KEY not in annotations:
                    continue

                default = annotations[DEFERRED_KEY].pop("_defaultpage", None)
                if default and default in obj:
                    logger.info("Setting %s as default page for %s", default, obj.absolute_url())
                    obj.setDefaultPage(default)
                if not annotations[DEFERRED_KEY]:
                    annotations.pop(DEFERRED_KEY)
            api.portal.show_message("Done", self.request)
            return self.index()

``collective.jsonify`` puts the information about relations, translations and default pages into the export file.
You can use the deferred-import approach to deal with that data after all items have been imported.
The ``ImportDeferred`` example above uses it to set the default pages.

The ``global_obj_hook`` below stores that data in an annotation:

.. code-block:: python

    def global_obj_hook(self, obj, item):
        # Store deferred data in an annotation.
        keys = ["_defaultpage"]
        data = {}
        for key in keys:
            if value := item.get(key, None):
                data[key] = value
        if data:
            annotations = IAnnotations(obj)
            annotations[DEFERRED_KEY] = data
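For orientation, here is a minimal sketch of how those pieces can hang together, assuming your custom import view subclasses the add-on's ``ImportContent`` view (as described earlier in this document) and that ``ImportDeferred`` is registered as the ``import_deferred`` browser view (ZCML not shown). The ``DEFERRED_KEY`` value is only an illustrative choice; any unique key works as long as the hook and the deferred-import view agree on it:

.. code-block:: python

    from collective.exportimport.import_content import ImportContent
    from zope.annotation.interfaces import IAnnotations

    # Illustrative annotation key, shared by the hook and the ImportDeferred view
    DEFERRED_KEY = "collective.exportimport.deferred"


    class CustomImportContent(ImportContent):
        """Custom import view that stashes deferred data on each newly created object."""

        def global_obj_hook(self, obj, item):
            keys = ["_defaultpage"]
            data = {key: item[key] for key in keys if item.get(key)}
            if data:
                IAnnotations(obj)[DEFERRED_KEY] = data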
Translations
============

This product has been translated into:

- Spanish


Contribute
==========

- Issue Tracker: https://github.com/collective/collective.exportimport/issues
- Source Code: https://github.com/collective/collective.exportimport


Support
-------

If you are having issues, please let us know.


License
-------

The project is licensed under the GPLv2.


Written by
==========

.. image:: ./docs/starzel.png
    :target: https://www.starzel.de
    :alt: Starzel.de


Contributors
============

- Philip Bauer, bauer@starzel.de

- Maurits van Rees, m.van.rees@zestsoftware.nl

- Fred van Dijk, f.van.dijk@zestsoftware.nl

- Leonardo J. Caballero G., leonardocaballero@gmail.com


Changelog
=========


1.12 (2024-03-08)
-----------------

- Fix migrating blocks to make Volto sites portable and support plone.distribution.
  [pbauer, tlotze]


1.11 (2024-02-28)
-----------------

- Fix ``AttributeError: 'NamedFile' object has no attribute '_blob'`` when using the setting
  "Include blobs as blob paths" and exporting objects with
  plone.namedfile.file.NamedFile properties (so not blobs).
  [valipod]

- Add more Python 2 compatible version specifications and update the README.
  [thet]

- Fix ``KeyError: time`` when importing content with a workflow that does not have the ``time`` variable.
  [maurits]

- Allow to use fix_html_in_content_fields without applying the default html_fixer.
  [pbauer]

- Try to restore broken blobs when exporting content.
  [thet]

- When exporting into separate JSON files also write the errors to a separate errors.json file.
  This fixes an error at the end of the export and no errors being written.
  [thet]

- Add support for ATTopic export_content
  [avoinea]

- Add principals to groups that already exist during import (#228)
  [pbauer]

- In export_members ignore transitive membership of groups (#240)
  [pbauer]


1.10 (2023-10-11)
-----------------

- Don't re-use `mapping` variable when migrating portlet data.
  [witsch]

- Fix editing revision author - refs #216
  [avoinea]

- Better support for portal import which avoids parsing JSON twice.
  [gotcha]

- Migrate portlets on site root.
  [ThibautBorn]

- Support export & import to have one separate json-file per content item.
  [pbauer]


1.9 (2023-05-18)
----------------

- Allow passing custom filenames to exports
  [pbauer]

- Support export and import of Plone Site root (using update strategy).
  [pbauer]

- Fix blob export when Connection uses TmpStore
  [gotcha, pbauer]

- Fix portlet richtext field import
  [mpeeters]

- Add portlet location on exported data
  [mpeeters]

- Migrate root of portlets that used a path in plone4 to using a uid (navigation, search, events, collection).
  [pbauer]

- Make export of discussions and portlets contextual
  [mpeeters]

- Fix critical bug when importing groups: Do not import groups that a group belongs to as members of the new group.
  This could have caused groups to have more privileges than they should.
  [pbauer]


1.8 (2023-04-20)
----------------

- Import: run set_uuid method before we call custom hooks, so the hooks have access to
  the item UUID. Fix #185.

- Document COLLECTIVE_EXPORTIMPORT_CENTRAL_DIRECTORY in README.
  [fredvd]

- Add Spanish translation.
  [macagua]

- Add i18n support.
  [macagua]

- Fix html: improve mapping from scale to picture variant.
  [maurits]

- Allow overriding the fallback variant in img_variant_fixer.
  Use 'medium' by default.
  [maurits]

- Let fix_html view work on the current context.
  [maurits]

- Fix the way we get a blob path. (#180)
  [ale-rt]

- Create documents as containers for items without parent when documents are folderish.
  [JeffersonBledsoe]

- Add support for passing any iterator as data-source to the import.
  [pbauer]

- Add example for importing collective.jsonify data to documentation.
  [pbauer]

- Better serialization of Topics:
  - Use newer criteria added in Plone 5
  - Add fallback for some criteria
  - Export sort_on and sort_reversed
  - Export customView as tabular_view
  [pbauer]

- Always import discussions, independent of whether discussion support is enabled
  on a particular content object (#182)
  [ajung]


1.7 (2023-01-20)
----------------

- Filter out 'Discussion Item' in content type export list. Comments have their own export and
  import views. A normal content type export for comments will raise a KeyError when trying to find
  the parent. (#112)
  [fredvd]

- Be more specific in the import_translation endpoint condition to install in a site with p.a.multilingual 1.x
  [erral]

- Fix importing hidden portlets as visible. (#152)
  [pbauer]

- Use ``Language=all`` when querying TranslationGroup items
  [erral]

- Fix members import, by handling members that already exist.
  [sunew]

- Don't use new_id because a hook can change ``item["id"]``
  [pbauer]

- Support exporting the blob-path without having access to the blobs.
  [pbauer]

- Set image-variants in html-fields when running @@fix_html targeting Plone 6.
  [pbauer]


1.6 (2022-10-07)
----------------

- Export and import all group-members (including ldap-users and -groups).
  Previously it only exported users and groups created in Plone.
  [pbauer]

- Support importing content without a UUID (e.g. for importing from an external source).
  The minimal required data is @id, @type, id, and @parent["@id"].
  [pbauer]

- Export only the value when serializing vocabulary-based fields instead of token/title.
  [pbauer]

- Improve logging of errors during import.
  [pbauer]

- Add INCLUDE_PATHS to specify which paths only should be imported.
  [pbauer]

- Add import_review_state to allow overriding setting the review_state during import.
  [pbauer]

- Export parent UID and use it to find the container to import.
  [pbauer]

- Move the various export-hooks into update_export_data for readability.
  [pbauer]

- Support export to server by passing ``download_to_server=True`` for all exports (#115).
  [pbauer]

- Add support for adding custom html-fixers to fix_html_in_content_fields.
  [pbauer]


1.5 (2022-04-26)
----------------

- Fix AttributeError for getPhysicalPath when checking parent, issue 123.
  [maurits]

- Export and import redirection tool data.
  [gotcha, Michael Penninck]

- Serialize Products.TALESField fields as raw instead of evaluated expression.
  (useful to export PFG overrides)
  [sauzher]

- Make sure we never change an acquired modification_date or creation_date.
  [pbauer]

- Export and import workflow_history.
  [pbauer]

- Fail gracefully on errors during importing portlets.
  [pbauer]

- Ignore non-folderish containers that content should be imported into.
  [pbauer]

- Use catalog instead of ZopeFindAndApply and better logging for export_discussion.
  [pbauer]

- Add converter for long ints (py2 only).
  [pbauer]

- By default do not export linkintegrity relations.
  [pbauer]

- Log detailed exception when exporting content fails.
  [pbauer]

- Add start and finish hooks for export of content.
  [pbauer]

- Rewrite export/import of default pages: Use uuid of default-page instead of id.
  Rewrite getting default_page to fix various issues with translated content.
  [pbauer]

- Add export and import of versions/revisions of content (#105).
  [pbauer]


1.4 (2022-01-07)
----------------

- Fix ``debug`` flag in ``ExportRelations``
  [petschki]

- Deserialize portlet-data using restapi to fix importing RichText.
  [pbauer]

- Fix importing richtext with html-entities. Fixes #99
  [pbauer]

- Preserve links to browser-views by using a custom find_object. Fixes #97
  [pbauer]

- Ignore linkintegrity when importing items with replace-strategy.
  [pbauer]

- Add tests for fix_html.
  [pbauer]


1.3 (2021-12-08)
----------------

- Handle default page of the site root object.
  [fulv]

- Optionally (checkbox) skip existing content on import instead of creating it anew with a randomized id.
  [petschki]

- Fix `UnboundLocalError` when calling `import_content` with `return_json` and `server_file`.
  [petschki]

- Add option to make a commit every x items.
  [pbauer]

- Improve logging during import in various cases.
  [pbauer]

- Work around case where api.content.get(path=parent_path) raises NotFound instead of returning None.
  [pbauer]

- Keep value of import_to_current_folder.
  [pbauer]

- Fix html unescape in py3.
  [pbauer]

- Fix serializing ATNewsItem image field content.
  [gotcha]

- Migrate eventUrl to event_url (AT to DX).
  [ThibautBorn]

- Log items that cannot be serialized instead of aborting the export.
  [ThibautBorn]

- Add an item_hook to export_localroles.
  [ThibautBorn]

- Fix handling of checkboxes for skip_existing_content and import_to_current_folder.
  [pbauer]

- Move intermediary commit code into commit_hook method to allow overriding.
  [pbauer]

- Add hook global_obj_hook_before_deserializing to modify the created obj before deserializing the data.
  [pbauer]

- Add support to update and to replace existing content during import (#76)
  [pbauer]

- Reindex permissions after importing local roles.
  [pbauer]

- Add export/import for constraints but import content without checking constraints or permissions (#71).
  [pbauer]


1.2 (2021-10-11)
----------------

- Prevent creating content in a different Plone Site in the same database (#52).
  In general, clean up parent paths when in development on localhost.
  [maurits]

- Read environment variable ``COLLECTIVE_EXPORTIMPORT_CENTRAL_DIRECTORY`` (#51).
  When set, this is used for storing an export file and getting an import file.
  This is useful for sharing content between multiple Plone Sites on the same server.
  [maurits]

- Unescape html entities and line-breaks when importing comments (#43).
  [pbauer]

- Export and import complete sites or content trees with configurable types, depth and path (#40).
  [pbauer]

- Added option to export blobs as blob paths (#50).
  [pbauer, maurits]

- Fixed creating missing folder structure (#45).
  [maurits]

- Export and import portlets (#39).
  [pbauer]

- Export content and write to file using a generator/yield.
  This avoids memory ballooning to the size of the exported file (#41).
  [fredvd]


1.1 (2021-08-02)
----------------

- Add option to import file from server.
  [maurits]

- Add missing ``</form>`` closing tag in ``export_content.pt``
  [petschki]

- Support disabled acquisition of local roles during export/import of local roles.
  [pbauer]

- Use unrestrictedSearchResults to actually export all content.
  [pbauer]

- Add commit message after importing one type.
  [pbauer]

- Fix getting container for some cases.
  [pbauer]

- Fix use in Plone 4.3 without dexterity, zc.relation or plone.app.contenttypes.
  [pbauer]

- Fix @id of collections and parents of subcollections. Fix #30
  [pbauer]

- Fix use in Plone 4.3 with dexterity but without z3c.relationfield.
  [maurits]

- Add export and import for discussions/comments.
  [pbauer]

- Add option to fix collection queries after import.
  [thomasmassmann]

- Reset Creation Date. Fix #29
  [pbauer]

- Remove custom serializer for relations because of ConfigurationConflictError with restapi.
  Relations are dropped anyway in update_data_for_migration when using the default setting.
  [pbauer]

- Migrate batch size for topics.
  [pbauer]

- Fix issue of reusing the previous container when no container for an item could be found.
  [pbauer]

- Add hook self.finish() to do things after importing one file.
  [pbauer]

- Fix installation with older versions of setuptools (#35)
  [pbauer]

- Fix installation using pip (#36)
  [ericof]

- Do not constrain exportable FTIs to allow export of types such as CalendarXFolder or ATTopic Criteria.
  [pbauer]

- Add hook self.start() to do things before importing one file.
  [pbauer]


1.0 (2021-04-27)
----------------

- Support setting values with ``factory_kwargs`` when creating instances during import.
  This can be used to set values that need to be there during subscribers to IObjectAddedEvent.
  [pbauer]


1.0b1 (2021-03-26)
------------------

- Add option to save export on server.
  [pbauer]

- Fix issues in import_relations and import_ordering.
  [pbauer]

- Use links to other exports in export_content for easier override.
  [pbauer]

- Add support for exporting LinguaPlone translations.
  [pbauer]


1.0a2 (2021-03-11)
------------------

- Simplify package structure and remove all unneeded files
  [pbauer]

- Add export/import for position in parent
  [pbauer]


1.0a1 (2021-03-10)
------------------

- Initial release.
  [pbauer]
    "bugtrack_url": null,
    "license": "GPL version 2",
    "summary": "An add-on for Plone to Export and import content, members, relations, translations and localroles.",
    "version": "1.12",
    "project_urls": {
        "Documentation": "https://github.com/collective/collective.exportimport#readme",
        "Homepage": "https://github.com/collective/collective.exportimport",
        "PyPI": "https://pypi.python.org/pypi/collective.exportimport",
        "Source": "https://github.com/collective/collective.exportimport",
        "Tracker": "https://github.com/collective/collective.exportimport/issues"
    },
    "split_keywords": [
        "python",
        "plone",
        "cms"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "a30663f8666c666105e353c11e9f21985c8d6f0d4b5158bba1f0b2ecad9ae003",
                "md5": "3d12826021e607c208da0ddb7fe6fad3",
                "sha256": "3ada9bf517523715568ddedc825f08f4c7f50ed805b7e689e5f2536535764c90"
            },
            "downloads": -1,
            "filename": "collective.exportimport-1.12-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "3d12826021e607c208da0ddb7fe6fad3",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*",
            "size": 132514,
            "upload_time": "2024-03-08T10:37:47",
            "upload_time_iso_8601": "2024-03-08T10:37:47.031951Z",
            "url": "https://files.pythonhosted.org/packages/a3/06/63f8666c666105e353c11e9f21985c8d6f0d4b5158bba1f0b2ecad9ae003/collective.exportimport-1.12-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "a1aa40140443bf647e9ea87ed442cf15c6c10533bd96d92786fd2b9422897288",
                "md5": "81b87e5d3c2e544652d46df9f7d5677e",
                "sha256": "4f35d8426df696b13d23c87cc2a4767cd617f52fb8d0f1d00b5229a5cf329a3d"
            },
            "downloads": -1,
            "filename": "collective.exportimport-1.12.tar.gz",
            "has_sig": false,
            "md5_digest": "81b87e5d3c2e544652d46df9f7d5677e",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*",
            "size": 391378,
            "upload_time": "2024-03-08T10:37:50",
            "upload_time_iso_8601": "2024-03-08T10:37:50.152597Z",
            "url": "https://files.pythonhosted.org/packages/a1/aa/40140443bf647e9ea87ed442cf15c6c10533bd96d92786fd2b9422897288/collective.exportimport-1.12.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-03-08 10:37:50",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "collective",
    "github_project": "collective.exportimport",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "requirements": [
        {
            "name": "zc.buildout",
            "specs": [
                [
                    "==",
                    "3.0.1"
                ]
            ]
        },
        {
            "name": "setuptools",
            "specs": [
                [
                    "<",
                    "67"
                ]
            ]
        }
    ],
    "tox": true,
    "lcname": "collective.exportimport"
}
        