contentACCESS documentation – version 4.2

7.4. Storages

(Email Archive ⇒ Settings ⇒ Storages button;
File Archive ⇒ Settings ⇒ Storages button;
SharePoint Archive ⇒ Settings ⇒ Storages button;
Custom plugins ⇒ General ⇒ Storages button)

For storage configurations, open the Storages page (navigate to the Storages button on your ribbon). The storages configured on this page can be selected as the destination for the processed binaries when configuring a contentACCESS job. contentACCESS supports Disk storage (the most frequently used type), H&S Hybrid Store, Perceptive, Datengut storage, etc. The table of storages is initially empty.

To configure a new storage, click on + new on the Storages page. The Storage repository window will open. Type in the Store name and select a Store type from the list. The required storage settings depend on the storage type that you have selected.

Configurations of the most frequently used storage types will be detailed in the following sections of this chapter:

✓ Disk store type

This store type is used to store the binaries on a single local or remote disk. It is the most frequently used of the store types listed above. After this store type has been selected, fill in the Path (the target destination for the binaries) and enter credentials if they are required.

The user can also decide whether to use the Compression function. If this checkbox is checked, all files larger than 4 kilobytes will be compressed, except for already compressed file formats such as JPG, MP3, etc. This feature may slow down the store operations, but on the other hand it saves storage space.
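
The compression rule above can be pictured with a short sketch. This is a minimal illustration only (Python, hypothetical helper name, gzip as a stand-in for whatever algorithm the product uses internally); the 4 KB threshold and the skip-list of already compressed formats follow the paragraph above, and the exact list used by contentACCESS may differ:

    import gzip
    from pathlib import Path

    # Formats that are already compressed and not worth compressing again
    # (illustrative list only).
    ALREADY_COMPRESSED = {".jpg", ".jpeg", ".mp3", ".zip", ".png", ".mp4"}
    COMPRESSION_THRESHOLD = 4 * 1024  # files larger than 4 kilobytes get compressed

    def store_binary(source: Path, target_dir: Path) -> Path:
        """Store a file in the disk store, compressing it when it pays off."""
        data = source.read_bytes()
        compressible = (
            len(data) > COMPRESSION_THRESHOLD
            and source.suffix.lower() not in ALREADY_COMPRESSED
        )
        if compressible:
            target = target_dir / (source.name + ".gz")
            target.write_bytes(gzip.compress(data))  # trades CPU time for disk space
        else:
            target = target_dir / source.name
            target.write_bytes(data)
        return target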

It is also possible to enable SnapLock for the disk store. To do this, check the Enable SnapLock checkbox – the storage system can then prevent deletion of the files until the specified retention has expired. This feature can be used with storage systems that support it, for example NetApp.
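
SnapLock itself is enforced by the storage system (e.g. NetApp), not by contentACCESS, but the idea of committing a file for a retention period can be sketched conceptually. The snippet below is an assumption-laden illustration of a WORM-style commit (retention date recorded in the file's access time, write permissions removed), not the product's actual mechanism:

    import os
    import stat
    from datetime import datetime, timedelta

    def commit_to_worm(path: str, retention_days: int) -> None:
        """Conceptual WORM commit: record a retention date and make the file read-only.

        On a real SnapLock volume the storage system itself enforces that the file
        cannot be deleted or modified before this date.
        """
        retention_until = datetime.now() + timedelta(days=retention_days)
        mtime = os.path.getmtime(path)                        # keep the modification time
        os.utime(path, (retention_until.timestamp(), mtime))  # access time = retention date
        mode = os.stat(path).st_mode
        os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))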

If you wish to keep only one copy of the physical file in the store, check the Use single instancing checkbox.
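
Single instancing can be understood as content-addressed storage: binaries with identical content map to one physical file. A minimal sketch of the idea (Python, hypothetical helper, SHA-256 used as the content key; the product's internal mechanism is not documented here):

    import hashlib
    import shutil
    from pathlib import Path

    def store_single_instanced(source: Path, store_root: Path) -> Path:
        """Store a file only once per unique content; return its physical location."""
        digest = hashlib.sha256(source.read_bytes()).hexdigest()
        target = store_root / digest[:2] / digest   # fan out into subfolders by hash prefix
        if not target.exists():                     # first occurrence: keep the bytes
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(source, target)
        return target                               # duplicates reuse the same physical file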

To enhance the storage security, it is possible to store the files in encrypted form in the disk storage by checking the Use file encryption checkbox. This way the files are encrypted (using the AES encryption algorithm) as they are stored. The encryption key must be kept in a safe place in case of database failure; without it the stored files will be unreadable. Download it by clicking the Download link in this section. If the user tries to save the storage configuration without downloading the encryption key, the system will display a warning.

Note: The file encryption option may slow down the storage functionality, both for archiving and retrieval.
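
To illustrate why the downloaded key is so important: with symmetric (AES-based) encryption the very same key is needed to read the data back, so a lost key means unreadable files. The sketch below uses the Python cryptography package's Fernet recipe (AES-based) purely as a stand-in for the product's internal implementation:

    from pathlib import Path
    from cryptography.fernet import Fernet  # AES-based symmetric encryption recipe

    # Corresponds to the "Download" link in the dialog: keep this key in a safe
    # place - without it the encrypted files cannot be decrypted.
    key = Fernet.generate_key()
    Path("storage-encryption.key").write_bytes(key)

    cipher = Fernet(key)

    def store_encrypted(source: Path, target: Path) -> None:
        target.write_bytes(cipher.encrypt(source.read_bytes()))

    def read_decrypted(stored: Path) -> bytes:
        return cipher.decrypt(stored.read_bytes())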

Under Database settings, select a Database connection. This configuration plays an important role when using the Storage replication plugin. It is also possible to run a test via the Storage replication plugin. We recommend NOT changing the already configured database that the disk storage uses (however, it can be changed using the “Change” button if needed).

We also advise verifying the connection using the Test button. The test checks whether the correct credentials have been specified and whether the connection with the database can be established.

When using the Disk store type, please consider the requirements for data deduplication. Data deduplication finds and removes duplication within data on a volume while ensuring that the data remains correct and complete. This makes it possible to store more file data in less space on the volume. Deduplication is not supported on:

System or boot volumes;
Remote mapped or remote mounted drives;
Cluster shared volume file system (CSVFS) for non-VDI workloads or any workloads on Windows Server 2012;
Files approaching or larger than 1 TB in size;
Volumes approaching or larger than 64 TB in size.

 

What to consider before a previously configured disk storage is changed?

In some cases, the administrator may need to move an already configured storage to a new location (e.g. if the “C” disk got full and the storage should be moved to disk “D”). In this case, the following steps must be executed to ensure access to the already archived data and to continue archiving new data:

  1. Stop all running jobs!!!
  2. Move your old storage location manually into the new storage location (e.g. the folder from the “C” disk into a new location on the “D” disk) – a scripted sketch of this copy step follows the list;
  3. Open the Storages page on the contentACCESS Central Administration ribbon, locate your old storage in the list and double click on it to open the Storage repository window for that storage;
  4. In the Storage repository window specify the new storage path (where you moved your old storage location);
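
Step 2 can be done with any copy tool; a scripted sketch of the copy step (Python, with hypothetical source and target paths) might look like this:

    import shutil
    from pathlib import Path

    old_store = Path(r"C:\contentACCESS\DiskStore")   # hypothetical old location
    new_store = Path(r"D:\contentACCESS\DiskStore")   # hypothetical new location

    # Copy the whole folder structure, keeping file metadata (timestamps etc.).
    shutil.copytree(old_store, new_store, dirs_exist_ok=True)

    # Remove the old location only after verifying that the copy is complete.
    # shutil.rmtree(old_store)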


These settings ensure that the archive and restore jobs will automatically use the new storage path. The administrator is not required to change the storage settings on the jobs’ configuration page either; the new value will automatically be used by the jobs where the “old” storage was already configured.

✓ HybridStore
contentACCESS supports the connection with the Hybrid Store. This connection allows contentACCESS to connect to any third-party storage that is supported by this store type. If you want to store the binaries in the Hybrid Store, select it from the Store type dropdown list and specify the required connection settings. The following storage settings are required:

Store name: optional name for the Hybrid Store that contentACCESS will use

Store type: HybridStore

Server name: the server where the Hybrid Store is installed

Binding: the protocol used to communicate with the Hybrid Store (a small decision sketch follows this list)

  • http – universal protocol
  • net.tcp – can be used only if the Hybrid Store and contentACCESS are in the same domain
  • net.pipe – the fastest and recommended protocol; can be used only if the Hybrid Store and contentACCESS are installed on the same machine
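
The binding choice follows directly from where the Hybrid Store runs relative to contentACCESS. A small decision helper (Python, hypothetical function; it only restates the rule above) makes this explicit:

    def choose_binding(same_machine: bool, same_domain: bool) -> str:
        """Pick the fastest binding that the deployment allows."""
        if same_machine:
            return "net.pipe"  # fastest; Hybrid Store and contentACCESS share a machine
        if same_domain:
            return "net.tcp"   # both are in the same domain
        return "http"          # universal fallback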

Use secure connection: Check this option to allow a secure connection with the Hybrid Store. The communication will be secured by Windows authentication (the contentACCESS service user will be used).

Hybrid store tenant: Click on “Load” to load the list of available tenants based on your Hybrid Store configurations and select the one that should be applied. There are some requirements to load the tenants:

  • a NET.TCP or NET.PIPE connection must be used (loading is not supported over HTTP)
  • the contentACCESS service user must be a local system administrator on the HybridStore machine

If the tenants are not loaded, then the user needs to enter the HybridStore tenant ID (GUID) manually.

Scheme: Click on “Load” to load the available Hybrid Store schemes and select the one that should be applied. If the Hybrid Store uses a secure connection and the Use secure connection option is not checked in the dialog, the schemes will not be loaded.

Database settings: Select an already created database that contentACCESS will use to store the necessary data. It is NOT recommended to change the already configured database that the Hybrid Store uses; however, it can be changed using the “Change” button if it is really necessary.

It is recommended to verify the database connection using the “Test” button at the bottom of the dialog.

✓ Datengut
This storage type is currently used when synchronizing emails in multiple mailboxes. For more information, please refer to the Email synchronizer plugin chapter.

After this store type has been selected in Storage settings, name your storage. contentACCESS will use this name to display the storage on the Storages page. Further configure the following sections in the dialog:

Connection configuration: necessary settings to establish a connection with the Datengut storage

  • Endpoint URL: Datengut service URL
  • Api key: optional setting, any value can be entered here
  • Email archive default folder id: during the archiving process Datengut storage saves the data into a specific storage folder; the ID of this folder must be set here

User credentials: set the Username and Password that can be used to connect to the storage

Plugin configuration: The Email synchronizer job(s) already created under Custom plugins ⇒ General ⇒ Jobs will be listed in the dropdown list. Select the Email synchronizer job that will collect the metadata of the archived emails in a queue. This job is used to synchronize the email message categories in multiple mailboxes based on the metadata saved in its queue.

Note: For more information refer to chapter Email synchronizer plugin.

Database settings: select an already created database that Datengut storage will use to store the necessary data

✓ Perceptive
After this store type has been selected, name your storage. contentACCESS will use this name for the given storage on the Storages page. The user is further required to specify the following connection parameters:

  • Server name: the server name where the storage is installed
  • CC library path: path on which the Classic Connector jar files can be reached, e.g. C:\Program Files\SAPERION\Application\scr\scr-classicconnector\lib
  • JAVA_HOME: the JAVA home directory must be set depending on the application bitness, x64 or x86, e.g. C:\Program Files\Java\jre1.8.0_66 (a short environment check is sketched after this list)
  • Tenant: in case of a non-multitenant system this field is left blank, otherwise the Perceptive tenant should be specified
  • Login type: “INDEX”, “ADMIN” or “ELM” can be used, the recommended type is “INDEX”
  • Authentication: choose the applicable authentication from the dropdown list; the recommended is “UserName”
  • User, Password: specify the user and password to be used for the connection with the storage
  • DDC: the DDC name where the files will be packed
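
Because the Classic Connector is Java-based, the CC library path and JAVA_HOME simply have to point at a consistent Java environment. A quick sanity check is sketched below (Python, using the example paths from the list above as hypothetical values):

    import os
    from pathlib import Path

    cc_lib_path = Path(r"C:\Program Files\SAPERION\Application\scr\scr-classicconnector\lib")
    java_home = Path(os.environ.get("JAVA_HOME", r"C:\Program Files\Java\jre1.8.0_66"))

    # The Classic Connector jar files must be reachable on the CC library path...
    jars = sorted(cc_lib_path.glob("*.jar"))
    print(f"{len(jars)} Classic Connector jar(s) found in {cc_lib_path}")

    # ...and JAVA_HOME must point at a Java runtime whose bitness (x64/x86)
    # matches the application bitness.
    print("java.exe present:", (java_home / "bin" / "java.exe").exists())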


✓ Kendox storage
Kendox storage can be used for various operations such as email and file archiving, SharePoint archiving, and publishing data into SharePoint libraries and custom lists. In the Storage repository window, select the Kendox store type, name your storage and configure the connection as follows:

Connection configuration section:

  • Endpoint URL: Kendox service URL
  • Culture: The names of folders and properties will be shown in the language specified here. The value can be a language code (e.g. de) or a combined language and country code (e.g. de-CH) – a short validation sketch follows.
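
The accepted Culture values follow the usual language / language-country pattern. A short validation sketch (Python, hypothetical helper; only the two shapes mentioned above are checked):

    import re

    # "de" (language only) or "de-CH" (language plus country), as described above.
    CULTURE_PATTERN = re.compile(r"[a-z]{2}(-[A-Z]{2})?")

    def is_valid_culture(value: str) -> bool:
        return CULTURE_PATTERN.fullmatch(value) is not None

    assert is_valid_culture("de")
    assert is_valid_culture("de-CH")
    assert not is_valid_culture("german")
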
Authentication configuration:

  • Username and password: Specify the Username and Password that can be used to connect to the storage.
    Use empty password: If this is checked, an empty string is saved as the password. Otherwise – if the username was not changed – the old password is kept.
  • Default folder path: During the archiving or publishing process, the data will be saved/published from the folder specified in this path.
  • Mobile Web Client URL: This option needs to be set only if the storage is used in a SharePoint Publishing job and the job is configured to create custom lists with links pointing to the source documents. The URL must be the Kendox Mobile Web Client’s URL, where the published documents can be accessed. It is used to generate the retrieve URL of the published documents.

Import template configuration:

In the archiving process, an import template is used to map the metadata between the source document and the newly created document in the Kendox store. Click the “Get available import templates” button to load the available import templates. The selected import template is only used if the target folder does not have a default import template; otherwise it is ignored.

(Screenshot: Kendox store configuration)

On the Storages page, you can modify the storage settings (Edit/Delete/Set default) from the given storage’s context menu by clicking the ellipsis (…). The configurations set in the Storage repository window can be viewed in the grid.

(Screenshot: storage context menu and manage access option)

Note: The “+ new” option above the item list is unavailable for the logged-on user if the “Add repository items” permission on the tenant is not allowed in his role assignment.
The “manage access” option allows granting access permissions on the selected storage to another user. This option is available for the logged-on administrator if his role assignment contains the Edit repository items – All allowed permission on the tenant. Read more in Managing access to contentACCESS objects.

✓ Azure storage
Azure Storage is a cloud storage solution. It also provides the storage foundation for Azure Virtual Machines and is accessible from anywhere in the world, from any type of application and from any type of device. Azure storage uses a blob store to store its metadata.

If the user has/uses the German cloud (a dedicated and isolated Microsoft Azure version for Germany), the Azure storage account name must contain the .core.cloudapi.de suffix. This suffix is displayed in the Azure configuration when the user creates the storage.
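
contentACCESS handles the blob access itself once the storage is configured; the endpoint naming difference can nevertheless be illustrated with the Azure SDK for Python (a sketch with placeholder account name, key and container, assuming the azure-storage-blob package):

    from azure.storage.blob import BlobServiceClient

    ACCOUNT = "mystorageaccount"          # placeholder account name
    KEY = "<storage-account-access-key>"  # placeholder access key

    # Global Azure cloud endpoint:   https://<account>.blob.core.windows.net
    # German cloud endpoint:         https://<account>.blob.core.cloudapi.de
    service = BlobServiceClient(
        account_url=f"https://{ACCOUNT}.blob.core.cloudapi.de",
        credential=KEY,
    )

    container = service.get_container_client("contentaccess-store")
    container.upload_blob(name="sample.bin", data=b"archived binary", overwrite=True)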
