1. Introduction

The Data Gateway solution is a robust and versatile product crafted to streamline secure and efficient data exchange between trading partners, applications, and cloud services. Underpinned by the Spring Boot framework, the Data Gateway functions as a central hub orchestrating authentication, file transfers, and cloud data transactions. This documentation serves as a comprehensive guide to understanding, implementing, and optimizing the multifaceted functionalities of the Data Gateway server.

2. About Data Gateway

Data Gateway serves as a cornerstone for centralized and secure file transfers, ensuring the confidentiality and integrity of sensitive data. Tailored for Super Admins, Admins, and File Operators, this application simplifies the intricacies of secure file exchange. Distinguished by its support for widely used protocols like SFTP (SSH File Transfer Protocol), FTP (File Transfer Protocol), and FTPS (FTP Secure), the Data Gateway focuses on delivering seamless and secure data transmission. The gateway places its Secure Component in the DMZ, with the remaining layers in a secure zone that has no internet access. The Secure Component handles the Forward and Reverse Proxy functionalities, and all connections are transmitted securely. Organizations benefit from the flexibility to choose the most suitable and secure protocol tailored to their specific use cases. Explore the features, configurations, and best practices outlined in this documentation to harness the full potential of Data Gateway for your data exchange needs.

3. Key Features

  1. Secure File Transfer: Utilize industry-standard protocols and cloud-native solutions for secure and encrypted file transfers, encapsulated with Forward/Reverse Proxy, File Validation, and Scanning capabilities.
  2. Role-Based Access: Different access roles (Super Admin, Admin, File Operator) ensure controlled access and tailored functionalities. 
  3. Comprehensive Dashboard: Gain insights into file transactions with detailed statistical reports, offering a clear overview of activities. 
  4. Cloud Configurations: Seamlessly configure cloud providers, allowing flexibility in storing partner-related keys/passwords. 
  5. Access Management: Empower Super Admins and Admins with the ability to create, update, or delete users and manage trading partner buckets. 
  6. Trading Partner Module: Set up partners with ease, choosing from various protocols (SFTP, FTP, FTPS) and cloud storages (GCS, S3, Azure, IBM-Object-Storage). 
  7. File Operator Module: Search, upload, and download files directly from assigned buckets, providing efficient file management. 
  8. Settings Module: Configure schedulers in the Polling Interval submodule for automated tasks. 
  9. File Replay: Restore deleted files and reprocess them from clouds to servers with the File Replay feature. 

4. Glossary

Partner: External entities who send/receive files.
Endpoint: Connections for transferring files.
Gateway: Organization/Data Gateway Server.
Push to Gateway: Partner connects to the Data Gateway Server and drops the file.
Push to Partner: Data Gateway connects to the Partner's remote server and drops the file.
Pull from Partner: Data Gateway connects to the Partner's remote server and pulls the file.
Pull from Gateway: Partner connects to the Data Gateway Server and pulls the file.
Pickup Directory: The directory from which the Client picks up files. Used for Pull from Partner and Pull from Gateway.
Drop Directory: The directory into which the Client drops files. Used for Push to Partner and Push to Gateway.

5. System Requirements

The Data Gateway Application is supported with the following system configurations:

                      Minimal            Recommended
  Operating System    Linux (RHEL) 8     Linux (RHEL) 9
  RAM                 8 GB               8 GB
  Disk                50 GB              100 GB
  CPU                 4 cores            8 cores

6. Application Access

The Data Gateway UI URL can be accessed once the deployment is complete, using the hostname provided in the deployment configuration. The UI prompts you for login credentials. Upon first login, the Administrator can log in with the factory credentials [Expl0re@123], which must be changed through Access Management after the initial login. This is the local authentication process, where the credentials are stored within the Data Gateway.

The application can also be deployed using a SAML profile, allowing authentication to be handled via SAML. In this process, the credentials are stored in the Identity Provider (IDP) (for example, OKTA) and not in the Data Gateway.

6.1 Authentication Process

When the application is deployed without SAML, the Data Gateway Application employs a robust two-factor authentication system for enhanced security. Users need to follow these steps for secure access:

  1. Enter a valid registered Username and Password in the Data Gateway UI. 
  2. Upon successful login, a unique one-time password (OTP) is sent to the user’s registered email address. 
  3. Input the received OTP to complete the authentication process. 

6.2 Application Swagger UI

Once the Data Gateway Application is deployed, users can access the application APIs through Swagger UI, providing a convenient and interactive interface for exploring and interacting with the application’s endpoints. Swagger UI simplifies the process of understanding and testing different functionalities through the APIs available. 

The APIs are enabled with both basic authentication and Token Authentication.  
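
As a minimal illustration of calling one of these APIs with Basic Authentication (the hostname and endpoint path below are placeholders; substitute real values from your deployment's Swagger UI), Java's built-in HTTP client can be used:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.Base64;

    public class ApiBasicAuthExample {
        public static void main(String[] args) throws Exception {
            // Hypothetical host and path; use your deployment's hostname
            // and a real API path discovered via Swagger UI.
            String url = "https://datagateway.example.com/api/v1/files";
            String credentials = Base64.getEncoder()
                    .encodeToString("username:password".getBytes());

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(url))
                    .header("Authorization", "Basic " + credentials)
                    .GET()
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
            System.out.println(response.body());
        }
    }

For Token Authentication, the same request would carry an "Authorization: Bearer <token>" header instead.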

7. Roles

The Data Gateway Application features role-based access control: 

  1. Super Admin: Dashboard, File Management, End Point Management, Access Management, Alert Management, Cloud Configurations, and Settings.
  2. Admin: Access Management (Create and Manage User and Group).
  3. File Operator: Dashboard and File Management.
  4. File Manager: Dashboard, File Management, End Point Management, Alerts Management, and Settings.
  5. Business User: Dashboard, File Transfers, End Point Management (Manage Endpoint), and Alerts Management (Manage Alerts & Alerts Dashboard).

8. Dashboard Reports (Statistics)

The Data Gateway Dashboard is your central hub for monitoring and managing file transactions. The dashboard provides a comprehensive summary of today's and recent transactions, ensuring you stay informed about your data transfer activities. Below is an overview of key metrics:

In each report, you’ll find the following details: 

  1. Total Files: The overall count of files uploaded/downloaded during the specified time frame.
  2. Total Size: The total size of files uploaded/downloaded throughout the week, today, or this month (in Bytes, KB, MB, GB, TB, and so on).
  3. Success: Number of files successfully uploaded/downloaded during the specified time frame.
  4. Failure: Number of files that encountered upload/download failures during the specified time frame.

These fields are consistent across all reports, providing a standardized view of your data transfer statistics. Stay tuned for real-time updates on your data transfer activities. 

9. Cloud Configurations

Cloud Configurations are a pivotal component of the Data Gateway, providing users with the ability to seamlessly integrate with various cloud providers for efficient storage and retrieval of files. This section guides users through the configuration process for different cloud providers, ensuring a secure and reliable data transfer experience. 

9.1 GCS Cloud Configurations

9.1.1 Service account creation

  1. Sign in to the [Google Cloud Console](https://console.cloud.google.com/).
  2. Select an existing project or create a new Google Cloud project.
  3. Navigate to "IAM & Admin" -> "Service Accounts".
  4. Click "+ CREATE SERVICE ACCOUNT" and fill in the required details.
  5. After creating the service account, search for it by name, open it, and assign the necessary roles/permissions (e.g., Storage Admin).
  6. Create and download a JSON key file:
    • Go to KEYS -> ADD KEY. Clicking ADD KEY presents two options: Create new key and Upload existing key.
    • Keys can be generated as .JSON or .P12 files.
    • Select the key type as JSON and click the Create button to download a JSON key file, as shown below.

Example .json file 
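
The downloaded service-account key is a JSON document containing fields such as type, project_id, private_key_id, private_key, client_email, and client_id. As a minimal sketch (the key path is a placeholder), such a file can be loaded to build an authenticated GCS client with the official google-cloud-storage Java library:

    import com.google.auth.oauth2.GoogleCredentials;
    import com.google.cloud.storage.Storage;
    import com.google.cloud.storage.StorageOptions;
    import java.io.FileInputStream;

    public class GcsClientFromKeyFile {
        public static void main(String[] args) throws Exception {
            // Placeholder path to the downloaded service-account key file.
            try (FileInputStream keyFile = new FileInputStream("/path/to/service-account.json")) {
                GoogleCredentials credentials = GoogleCredentials.fromStream(keyFile);
                Storage storage = StorageOptions.newBuilder()
                        .setCredentials(credentials)
                        .build()
                        .getService();
                // The client can now list buckets, upload objects, etc.
                storage.list().iterateAll().forEach(b -> System.out.println(b.getName()));
            }
        }
    }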

9.1.2 Pub/Sub Creation

Select the Pub/Sub category in the Google Cloud account and set up the Topics and Subscriptions. 

9.1.2.1 Topic Creation:
  1. Click Topics -> Create Topic and provide the Topic ID.
  2. When you create or update a topic, you must specify its properties:
    • Add a default subscription. Adds a default subscription to the Pub/Sub topic. You can create another subscription for the topic after the topic is created. The default subscription has the following properties:
      • Subscription ID of <topic ID>-sub
      • Pull delivery type
      • Message retention duration of seven days
      • Expiration after 31 days (about 1 month) of inactivity
      • Acknowledgment deadline of 10 seconds
      • Immediate retry policy
    • Use a customer-managed encryption key (CMEK). Specifies whether the topic is encrypted with a CMEK. Pub/Sub encrypts messages with Google-owned and Google-managed keys by default. If you specify this option, Pub/Sub uses the envelope encryption pattern with CMEK. In this approach, Cloud KMS does not encrypt the messages. Instead, Cloud KMS encrypts the Data Encryption Keys (DEKs) that Pub/Sub creates for each topic. Pub/Sub encrypts the messages using the newest DEK that was generated for the topic. Pub/Sub decrypts the messages shortly before they are delivered to subscribers.
9.1.2.2 Create Subscription:
  1. Click Subscriptions -> Create Subscription and provide the Subscription ID.
  2. You can use the default existing topic or create a new one by selecting Create Topic.
  3. If you select Create Topic, the console provides another pop-up to create the topic, with the following properties:
    • Schema. A schema is a format that the message data field must follow. A schema is a contract between the publisher and subscriber that Pub/Sub enforces. Topic schemas help standardize message types and permissions to allow them to be consumed by different teams in your organization. Pub/Sub creates a central authority for message types and permissions.
    • Enable ingestion. Enabling this property lets you ingest streaming data from external sources into a topic so that you can leverage the features of Google Cloud.
    • Message retention duration. Specifies how long the Pub/Sub topic retains messages after publication. After the message retention duration is over, Pub/Sub might discard the message regardless of its acknowledgment state. Message storage fees are charged for storing all messages published on the topic.
      • Default = Not enabled
      • Minimum value = 10 minutes
      • Maximum value = 31 days (about 1 month)
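For teams that script this setup instead of using the console, a minimal sketch with the official google-cloud-pubsub Java library (project, topic, and subscription IDs are placeholders) looks like this:

    import com.google.cloud.pubsub.v1.SubscriptionAdminClient;
    import com.google.cloud.pubsub.v1.TopicAdminClient;
    import com.google.pubsub.v1.ProjectSubscriptionName;
    import com.google.pubsub.v1.PushConfig;
    import com.google.pubsub.v1.TopicName;

    public class PubSubSetupSketch {
        public static void main(String[] args) throws Exception {
            String projectId = "my-project"; // placeholder
            TopicName topic = TopicName.of(projectId, "my-topic");
            ProjectSubscriptionName subscription =
                    ProjectSubscriptionName.of(projectId, "my-topic-sub");

            // Create the topic.
            try (TopicAdminClient topicAdmin = TopicAdminClient.create()) {
                topicAdmin.createTopic(topic);
            }

            // Create a pull subscription with a 10-second ack deadline,
            // matching the console defaults described above.
            try (SubscriptionAdminClient subAdmin = SubscriptionAdminClient.create()) {
                subAdmin.createSubscription(
                        subscription, topic, PushConfig.getDefaultInstance(), 10);
            }
        }
    }
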
9.1.3 Bucket Creation

Step 1: Log in to Google Cloud Console

  1. Open your web browser and go to Google Cloud Console. 
  2. Log in with your Google account. 

Step 2: Select Your Project

  1. Click the project selector dropdown at the top.
  2. Choose an existing project or create a new project.
  3. Ensure billing is enabled for your project.

Step 3: Navigate to Cloud Storage

  1. In the left sidebar menu, go to Buckets and click "Create" at the top.

Step 4: Enter a Unique Bucket Name

  1. The bucket name must be globally unique across all of Google Cloud. 
  2. It must:  
    • Contain only lowercase letters, numbers, dashes (-), and underscores (_). 
    • Be 3-63 characters long. 
    • Not start or end with a dash (-). 
  3. Example: my-gcs-bucket-123 
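
As a quick illustration of these listed rules only (a simplified check, not the complete GCS naming specification, which carries additional restrictions), a candidate name can be pre-validated like this:

    import java.util.regex.Pattern;

    public class BucketNameCheck {
        // Reflects only the rules above: lowercase letters, digits, dashes,
        // underscores; 3-63 characters; must not start or end with a dash.
        private static final Pattern NAME =
                Pattern.compile("^[a-z0-9_][a-z0-9_-]{1,61}[a-z0-9_]$");

        public static boolean isValid(String bucketName) {
            return NAME.matcher(bucketName).matches();
        }

        public static void main(String[] args) {
            System.out.println(isValid("my-gcs-bucket-123")); // true
            System.out.println(isValid("-bad-name"));         // false
        }
    }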

Step 5: Select a Storage Location

The location determines where your data is stored and affects performance and cost. You have three options: 

  • Region (e.g., us-central1): Data is stored in a single region. 
  • Dual-region (e.g., nam4): Data is replicated in two locations. 
  • Multi-region (e.g., US): Data is stored across multiple locations for higher availability.

Step 6: Choose a Storage Class

The storage class determines the cost and retrieval speed: 

  • Standard (default) – Best for frequently accessed data. 
  • Nearline – Best for data accessed once a month. 
  • Coldline – Best for data accessed once a year. 
  • Archive – Best for long-term storage (10+ years). 

Step 7: Configure Access Control

You can choose between: 

  • Uniform (Recommended) – Permissions apply at the bucket level. 
  • Fine-grained – Permissions apply to individual objects. 

 

Step 8: Set Advanced Settings (Optional)

8.1. Soft-delete policy (for data recovery) 

  • When enabled, objects deleted from the bucket are not immediately erased. Instead, they are retained for a specified period before permanent deletion. 
  • This allows recovery of mistakenly deleted objects within the retention period. 

Options: 

  • Use default retention duration -> The default retention period is 7 days unless changed by an administrator. 
  • Set custom retention duration -> Allows you to specify a custom duration for how long deleted objects should be retained. 

8.2. Object Versioning (for version control) 

  • When enabled, Cloud Storage keeps multiple versions of an object, allowing rollback to previous versions. 
  • Useful for protecting against accidental overwrites or deletions. 

Fields: 

  • Max. number of versions per object -> Defines how many previous versions of an object are stored. Setting this to 1 keeps only one previous version (not recommended for full overwrite protection); increasing this number keeps more backup versions.
  • Expire non-current versions after X days -> Automatically deletes older object versions after the specified period. This helps manage storage costs. Recommended: 7 days for the Standard storage class.

 

8.3. Retention (for compliance) 

  • Prevents objects from being deleted or modified for a specific period. 
  • Useful for regulatory compliance and data protection policies. 

Options: 

  • Set bucket retention policy -> Applies a uniform retention period for all objects in the bucket. 
  • Enable object retention -> Allows retention settings at the individual object level. 

Step 9: Enable Encryption (GCP-managed or Customer-managed keys) and Create the Bucket

  • By default, Google provides a Google-managed encryption key. Click "Create" to finalize the setup.
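
The same setup can be scripted. A minimal sketch with the google-cloud-storage Java library (bucket name and location are placeholders) that creates a bucket with a storage class, location, and object versioning enabled:

    import com.google.cloud.storage.Bucket;
    import com.google.cloud.storage.BucketInfo;
    import com.google.cloud.storage.Storage;
    import com.google.cloud.storage.StorageClass;
    import com.google.cloud.storage.StorageOptions;

    public class CreateBucketSketch {
        public static void main(String[] args) {
            Storage storage = StorageOptions.getDefaultInstance().getService();

            Bucket bucket = storage.create(
                    BucketInfo.newBuilder("my-gcs-bucket-123") // globally unique name (placeholder)
                            .setLocation("us-central1")        // single-region location
                            .setStorageClass(StorageClass.STANDARD)
                            .setVersioningEnabled(true)        // keep non-current versions for rollback/replay
                            .build());

            System.out.println("Created bucket: " + bucket.getName());
        }
    }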

9.1.4 Google Cloud Configuration in Data Gateway

The Cloud Provider configurations are set up for onboarding the Partners. The Data Gateway allows users to store partner-related keys or passwords in cloud-native secret managers, enhancing security and flexibility. In the absence of specific configurations for cloud secret managers, the keys or passwords are encrypted and stored in the database by default.

To leverage cloud-native Secret Managers, users need to enable the service by clicking on the ‘Secret Manager’ checkbox. Authentication is then conducted based on the configured settings. It’s important to note that, currently, Data Gateway supports single user accounts for cloud providers. 

 

  1. Navigate to Cloud Configurations.
  2. Click the add icon (+) displayed on the GCS Configurations tab.
  3. You can select the GCS Secret Manager Service:
    • This option enables the Google Cloud Secret Manager service.
    • It allows users to store, manage, and access sensitive information such as API keys, passwords, or certificates securely.
    • Only select this option if the integration is related to secret management instead of the database.
  4. Configuration Details:
    • Storage Service Name: Provide any unique name for the identification of the GCS Storage Service.
    • Bucket Name: Provide the name of the GCS bucket where files will be stored. This bucket serves as a container for data that needs to be archived or backed up. You can specify a new bucket name or an existing one. If using an existing bucket, please provide the associated topic and subscription names. If the bucket does not have any associated topics or subscriptions, create a new topic and subscription and enter their names in the respective fields.
    • Auth File: Provide the Google-Auth.json file.
    • Project ID: Provide the Project ID that was created in GCS.
    • Topic Name: Provide the Pub/Sub Topic Name where messages are published whenever an event happens in the bucket (such as object creation, update, or deletion). Currently only object-creation events are checked.
    • Subscription Name: Provide the Subscription Name associated with the Pub/Sub topic. Subscriptions allow services to receive messages from the topic.
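
Behind the Topic/Subscription fields is a standard GCS bucket notification. For reference, a sketch of wiring a bucket's object-creation events to a Pub/Sub topic with the Java client (names are placeholders; this illustrates the usual notification setup, not Data Gateway internals):

    import com.google.cloud.storage.NotificationInfo;
    import com.google.cloud.storage.Storage;
    import com.google.cloud.storage.StorageOptions;

    public class BucketNotificationSketch {
        public static void main(String[] args) {
            Storage storage = StorageOptions.getDefaultInstance().getService();

            // Publish a Pub/Sub message whenever an object is created in the bucket.
            NotificationInfo notification = NotificationInfo
                    .newBuilder("projects/my-project/topics/my-topic") // placeholder topic
                    .setEventTypes(NotificationInfo.EventType.OBJECT_FINALIZE)
                    .setPayloadFormat(NotificationInfo.PayloadFormat.JSON_API_V1)
                    .build();

            storage.createNotification("my-gcs-bucket-123", notification); // placeholder bucket
        }
    }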

9.2 Amazon Web Services (AWS) Configuration

9.2.1 IAM User Creation

  1. Sign in to the [AWS Management Console](https://aws.amazon.com/console/).
  2. Navigate to "IAM" -> "Users".
  3. Click "Add user" and provide a username.
  4. Assign "Programmatic access" and click Next.
  5. Attach the necessary policies (e.g., AmazonS3FullAccess).
  6. Once the user is created successfully, open the user details page, navigate to the Security credentials tab, and scroll down to the Access keys section.
  7. Click Create access key and download the access key (Access Key ID and Secret Access Key).

9.2.2 Steps to Configure AWS

  1. Navigate to Cloud Configurations.
  2. Click the add icon displayed on the AWS Configurations tab.
  3. Select the AWS Storage Service checkbox.
    • This option enables AWS Cloud Storage service integration.
    • When checked, it allows users to configure storage buckets in AWS.
  4. You can select the AWS Secret Manager Service:
    • This option enables the AWS Secret Manager service.
    • It allows users to store, manage, and access sensitive information such as API keys, passwords, or certificates securely.
    • Only select this option if the integration is related to secret management instead of the database.
  5. Configuration Details:
    • Bucket Name: Provide the name of the AWS S3 bucket where files will be stored. This bucket acts as a container for data that needs to be archived or backed up. You can provide a new bucket name or an existing one.
    • Access Key: The unique AWS Access Key ID used to authenticate API requests. It is required to grant programmatic access to AWS services.
    • Secret Key: The secret key associated with the Access Key ID. This key ensures secure authentication when accessing AWS resources.
    • Region: Provide the AWS region where the S3 bucket is hosted. The region selection ensures the data is stored in the correct geographical location based on performance and compliance requirements.
    • Queue Name: The name of the AWS SQS queue used for processing messages or events related to file storage; it identifies the queue used for message processing and communication between services. You can provide any custom name in this field.
    • Queue URL: The unique URL endpoint for the queue. This URL is used by applications to send and receive messages from the queue. It is automatically generated when the queue is created in services like AWS SQS.
    • Queue ARN: The Amazon Resource Name (ARN) is a globally unique identifier assigned to the queue in AWS. It provides a standardized way to reference the queue across AWS services and policies, ensuring proper access control and integration.
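
For reference, a minimal sketch of how these values map onto the AWS SDK for Java v2 (region, keys, bucket, and queue URL are placeholders; this illustrates the client setup, not Data Gateway's internal code):

    import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
    import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
    import software.amazon.awssdk.regions.Region;
    import software.amazon.awssdk.services.s3.S3Client;
    import software.amazon.awssdk.services.sqs.SqsClient;
    import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;

    public class AwsConfigSketch {
        public static void main(String[] args) {
            // Access Key / Secret Key fields (placeholders).
            StaticCredentialsProvider credentials = StaticCredentialsProvider.create(
                    AwsBasicCredentials.create("ACCESS_KEY_ID", "SECRET_ACCESS_KEY"));

            Region region = Region.US_EAST_1; // Region field

            // S3 client listing the configured bucket (Bucket Name field).
            S3Client s3 = S3Client.builder()
                    .region(region)
                    .credentialsProvider(credentials)
                    .build();
            s3.listObjectsV2(b -> b.bucket("my-s3-bucket"))
                    .contents()
                    .forEach(o -> System.out.println(o.key()));

            // SQS client polling the queue identified by the Queue URL field.
            SqsClient sqs = SqsClient.builder()
                    .region(region)
                    .credentialsProvider(credentials)
                    .build();
            sqs.receiveMessage(ReceiveMessageRequest.builder()
                            .queueUrl("https://sqs.us-east-1.amazonaws.com/123456789012/my-queue")
                            .maxNumberOfMessages(5)
                            .build())
                    .messages()
                    .forEach(m -> System.out.println(m.body()));
        }
    }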

     

9.3 Microsoft Azure Configuration

9.3.1 Storage Account Creation

  1. Sign in to the [Azure Portal](https://portal.azure.com/).
  2. Navigate to "Storage accounts".
  3. Click "+ Add" and fill in the required details to create a new storage account, or choose an existing one.
  4. Navigate to the storage account -> "Access keys".
  5. Click "Access keys" to get the connection string.

9.3.2 Steps to Configure Azure:

  1. Navigate to Cloud Configurations.
  2. Click the add icon displayed on the Azure Configurations tab.
  3. Select the Azure Storage Service checkbox.
    • This option enables Azure Cloud Storage service integration.
    • When checked, it allows users to configure storage containers in Azure.
  4. Storage Service Name: Provide any unique name to identify the storage service configuration.
  5. Connection String: The Azure Storage Account Connection String, which is required to authenticate and connect to the Azure Blob Storage service.
  6. Container Name: The name of the Azure Blob Storage Container where the files will be stored.
  7. Region: Specifies the Azure Region where the storage account is hosted, such as East US, West Europe, or Central India.
  8. Archive File Interval: Provide the lifecycle policy for Azure Storage, facilitating the replay of files within a specified period. Specifies the number of days after which files will be automatically archived from the storage container.
  9. Event Hub: The Event Hub Name that will be used to publish or subscribe to events.
  10. Event Connection String: The Azure Event Hub Connection String used to connect to the Event Hub for monitoring file activities or real-time data streaming.
  11. You can select the Azure Secret Manager Service:
    • This option enables the Azure Secret Manager service.
    • It allows users to store, manage, and access sensitive information such as API keys, passwords, or certificates securely.
    • Only select this option if the integration is related to secret management instead of the database.
  12. Key Vault Name: Provide the Azure Key Vault where sensitive information like secrets, certificates, and keys is securely stored.
  13. Subscription Id:
    • The Subscription Id is a unique identifier assigned to your Azure subscription.
    • This ID links the Key Vault service to the specific Azure subscription under which the services are being used.
    • You can find this Subscription Id in the Azure Portal under Subscriptions.
  14. Resource Group Name:
    • A Resource Group Name is a container that holds related Azure resources for your application.
    • It organizes all the Azure services like Key Vault, Storage Accounts, and Virtual Machines into a group.
    • Enter the exact resource group name under which the Key Vault is created.
  15. App Client Id:
    • The App Client Id is the Application (Client) ID registered in Azure Active Directory.
    • This ID identifies the application that needs to access Azure resources.
    • You can get this ID from the Azure App Registration under Azure Active Directory.
  16. Client Secret:
    • The Client Secret is a secret key generated during the App Registration process.
    • It acts like a password and is used to authenticate the application to access the Azure services.
    • This value needs to be copied while generating the secret in the Azure portal, as it will not be visible again.
  17. Tenant Id:
    • The Tenant Id is the Directory ID of your Azure Active Directory.
    • It identifies the Azure AD instance where your application is registered.
    • You can find this value in the Azure Active Directory Overview page.
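
For reference, a sketch of how these values are typically used with the official Azure SDK for Java (the connection string, names, and IDs are placeholders; shown for illustration, not as Data Gateway's internal code):

    import com.azure.core.credential.TokenCredential;
    import com.azure.identity.ClientSecretCredentialBuilder;
    import com.azure.security.keyvault.secrets.SecretClient;
    import com.azure.security.keyvault.secrets.SecretClientBuilder;
    import com.azure.storage.blob.BlobContainerClient;
    import com.azure.storage.blob.BlobServiceClient;
    import com.azure.storage.blob.BlobServiceClientBuilder;

    public class AzureConfigSketch {
        public static void main(String[] args) {
            // Connection String and Container Name fields (placeholders).
            BlobServiceClient blobService = new BlobServiceClientBuilder()
                    .connectionString("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...")
                    .buildClient();
            BlobContainerClient container = blobService.getBlobContainerClient("my-container");
            container.listBlobs().forEach(b -> System.out.println(b.getName()));

            // App Client Id, Client Secret, and Tenant Id fields authenticate against Azure AD.
            TokenCredential credential = new ClientSecretCredentialBuilder()
                    .tenantId("TENANT_ID")
                    .clientId("APP_CLIENT_ID")
                    .clientSecret("CLIENT_SECRET")
                    .build();

            // Key Vault Name field; secrets are fetched from the vault URL.
            SecretClient secrets = new SecretClientBuilder()
                    .vaultUrl("https://my-key-vault.vault.azure.net")
                    .credential(credential)
                    .buildClient();
            System.out.println(secrets.getSecret("partner-password").getValue());
        }
    }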

9.4 IBM Cloud Configuration

9.4.1 Service Credential Creation

  1. Sign in to the [IBM Cloud Console](https://cloud.ibm.com/).
  2. Navigate to "Resource List" -> "Storage" -> "Cloud Object Storage".
  3. Click on your instance and navigate to Service credentials.
  4. Click "New credential" and provide a name.
  5. After clicking New credential, a pop-up window appears as shown below. Fill in the required details and click Add to proceed.
  6. Save the generated credentials (API key, Access Key, Secret Key, and service instance ID), which will be required at a later stage.

9.4.2 Steps to Configure IBM Cloud:

  1. Navigate to Cloud Configurations.
  2. Click the add icon displayed on the IBM-Cloud Configurations tab.
  3. Select the IBM Storage Service checkbox.
    • This option enables IBM Cloud Storage service integration.
    • When checked, it allows users to configure storage buckets in IBM Cloud.
  4. Access Key: The Access Key generated from the IBM Cloud Object Storage service, used to authenticate API requests.
  5. Secret Key: The Secret Key associated with the Access Key, providing secure authentication.
  6. Endpoint URL: The API Endpoint URL for the IBM Cloud Object Storage service, which specifies the cloud storage service location (e.g., https://s3.us-south.cloud-object-storage.appdomain.cloud).
  7. Bucket Name: The name of the IBM Cloud Object Storage Bucket where files will be stored.
  8. Region: The IBM Cloud Region where the object storage service is hosted.
  9. Archive File Age: Like other providers, this parameter enables the lifecycle policy for IBM Cloud, facilitating the replay of files within a specified period.
  10. You can select the IBM Secret Manager Service:
    • This option enables the IBM Secret Manager service.
    • It allows users to store, manage, and access sensitive information such as API keys, passwords, or certificates securely.
    • Only select this option if the integration is related to secret management instead of the database.
  11. API Key:
    • The API Key is a unique identifier that provides secure access to IBM Cloud services.
    • This key is used to authenticate the application or service when connecting to IBM Secret Manager.
    • It is generated from the IBM Cloud Console while creating service credentials under the Secret Manager Service.
    • Without the API key, the system will not allow access to the secret manager.
  12. Service Url:
    • The Service URL is the endpoint URL of the IBM Secret Manager Service.
    • It specifies the location where the API requests are sent.
    • The URL typically follows the format: https://<region>.secrets-manager.appdomain.cloud.
    • The correct service URL is mandatory to connect with the respective IBM Cloud secret manager service.
  13. Secret Type:
    • This field defines the type of secret stored in the IBM Secret Manager.
    • Supported secret types include:
      • UsernamePassword: Stores credentials like username and password.
      • IAM Credentials: Stores IBM Cloud IAM access credentials.
      • API Key: Stores API keys for various services.
      • Certificates: Stores SSL/TLS certificates.
    • Based on the selected secret type, the application will fetch and use the respective secret details.
  14. Secret Group Name:
    • A Secret Group Name is a logical container to organize multiple secrets within the IBM Secret Manager.
    • It helps in categorizing secrets based on environment, project, or application.
    • For example, separate secret groups for Development, Testing, and Production environments.
    • This field ensures that secrets are accessed from the correct group.
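
Because IBM Cloud Object Storage exposes an S3-compatible API, the Access Key, Secret Key, and Endpoint URL fields can be exercised with a standard S3 client pointed at the IBM endpoint. A minimal sketch (keys and names are placeholders; shown under the assumption of S3 compatibility, not as Data Gateway's internal code):

    import java.net.URI;
    import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
    import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
    import software.amazon.awssdk.regions.Region;
    import software.amazon.awssdk.services.s3.S3Client;

    public class IbmCosSketch {
        public static void main(String[] args) {
            S3Client cos = S3Client.builder()
                    // Endpoint URL field: the IBM COS service endpoint.
                    .endpointOverride(URI.create(
                            "https://s3.us-south.cloud-object-storage.appdomain.cloud"))
                    .region(Region.of("us-south")) // Region field
                    // Access Key / Secret Key fields (placeholders).
                    .credentialsProvider(StaticCredentialsProvider.create(
                            AwsBasicCredentials.create("ACCESS_KEY", "SECRET_KEY")))
                    .build();

            // Bucket Name field: list the objects in the configured bucket.
            cos.listObjectsV2(b -> b.bucket("my-ibm-bucket"))
                    .contents()
                    .forEach(o -> System.out.println(o.key()));
        }
    }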

10. Access Management

The Access Management module in Data Gateway is a powerful tool available to Super Admins and Admins, providing them with the necessary authority to manage users effectively. This module facilitates the creation, updating, and deletion of user accounts, along with the ability to modify the active or inactive status of users. Additionally, admins can assign trading partner buckets as needed, directly loaded from the Google Cloud Storage (GCS) console. 

The features of Access Management vary based on the deployed profile. The sections below apply if the application is deployed with local authentication, without SAML. For SAML-based deployments, please refer here for details on Access Management.

10.1 Accessing the Access Management Module

To access the Access Management Module, follow the steps illustrated below: 

10.2 Create User

  1. Select the Access Management menu and click Create User to create a new user.
  2. Users can be created with role-based access, ensuring a tailored and secure user management experience. Please refer to 7. Roles for more information about each of the roles.
  3. In the case of limited-access roles, you will have the option to assign specific endpoints/directories to the user. Once you select the Storage Type and the Service Account Name, the available Endpoints are displayed, and you can pick and place them under the Directories for Upload and Download as per your business requirements.
  4. Ensure you provide valid email addresses, as the initial password and activation process happens through email. After user creation, an email for setting the password is automatically sent to the user. Upon clicking the link, the user is directed to the User Interface, where the password can be set.
  5. Password Security Measures:
    • Users are recommended to change their password in the 'Change Password' module, conveniently located on the right side of the user profile.
    • Clicking on the username reveals the 'Change Password' module, allowing users to update their password.
    • After updating the password, it is essential to log out of the application and log back in using the newly updated credentials for enhanced security.
 

10.3 Manage User

  1. In the ‘Manage User’ screen, users with Super Admin and Admin privileges gain the ability to:  
    • Activate/deactivate user accounts based on operational needs. 
    • Edit and Update user accounts with modified information. 
    • Delete user accounts that are no longer required. 
  2. Access to the 'Manage User' screen is enabled for Super Admins and Admins only, ensuring secure and controlled user management capabilities.

10.4 Create Group:

  1. Groups can be created from User Management -> Create Group.
  2. A group is created based on the Cloud Service Account Name of the Cloud Provider by assigning a set of partners to the Group.
    a. You can pick and place the required Partners from the Available pane to the Assigned pane.
  3. Groups can also be assigned to a User with the File Manager Role, allowing the user to work with the partners of the specified group.

10.5 Manage Group:

  1. Groups can be managed through User Management -> Manage Group. 
  2. Users with Super Admin and Admin privileges gain the ability to:
    • Update Groups with modified information.
    • Delete Groups that are no longer required.

11. Endpoint Management Module

The Endpoint Management Module in Data Gateway is a robust feature that enables users to seamlessly set up partnerships using a diverse range of protocols and cloud storage options. This module supports protocols such as SFTP, FTP, FTPS, along with various cloud storage providers including GCS, S3, AZURE, and IBM-OBJECT-STORAGE. 

11.1 Create Endpoint

Endpoints can be set up through industry-certified protocols, and Data Gateway enhances this by providing cloud storage options tailored to the requirements. The application takes care of setting up the appropriate cloud storage buckets.

  1. Endpoints can be created from Endpoint Management -> Create Endpoint.
  2. Provide the Endpoint details:

    Endpoint Name: Provide the Endpoint Name. 

    Endpoint ID: Provide a valid Endpoint ID to identify the Endpoint. 

    Email: Provide the email ID of the Endpoint contact for reference. 

    Phone: Provide the phone number of the Endpoint Contact. 

    Protocol: Choose from FTP, SFTP, FTPS, etc. 

    Status: Provide the status of the endpoint, whether it is Active or Inactive. 

    Storage Type: Choose from GCS, S3, AZURE, IBM-OBJECT-STORAGE. 

    Service Account Name: Choose the account which was set up earlier. Refer to 9. Cloud Configurations for more information.

    Field 1, Field 2: You can provide additional custom metadata which you would like to store. 

  3. The Protocol selection expands the screen with additional details for the protocol transmission. The information related to the protocol details is provided in the upcoming topic; you can click here to go directly to the Protocol Section.

11.2 Manage Endpoint

  1. Endpoints can be managed from Endpoint Management -> Manage Endpoint.
  2. It gives you the list of all Endpoints, which you can activate/deactivate, edit, or delete using the action buttons provided on the right side.

11.3 Protocols

Data Gateway supports multiple protocols, including FTP, FTPS, and SFTP, facilitating secure and reliable file transfers.

11.3.1 FTP (File Transfer Protocol)

FTP (File Transfer Protocol) is a communication protocol utilizing the client-server connection model. In Data Gateway, it can act both as a Server (Push-Pull) and a Client (Pull-Push). The configuration can be done for all combinations of PULL and PUSH. The connection is performed from the Cloud to the External Parties in Data Gateway through a secure layer implemented in the Secure Component.

Pull-Push

This is where Data Gateway acts as the Client and connects to the Endpoint provided (a client-side sketch follows the field list below).

  • Connection Type: Select between FTP Active and Passive modes, facilitating server-client data transfers.
  • Remote Host: Host name of the FTP Server to which the connection is established.
  • Remote Port: Port number of the FTP Server for the connection.
  • Transfer Type: Type of the transfer (Binary / ASCII).
  • Username: Username of the remote FTP server.
  • Password: Password of the remote FTP server.
  • No. Of Retries: Number of times to attempt connecting to the server.
  • Retry Interval: Time gap between attempts to access the server if the connection fails.
  • Remote file pattern: Regex pattern of the files to be pulled from the remote server.
  • Delete after collection: Checkbox indicating whether to delete the file from the remote FTP server once it is pulled.
  • Pickup Directories: Path of the directories on the remote FTP server from which the files need to be pulled. (Pull from Partner)
  • Drop Directories: Path of the directories on the remote FTP server to which the files need to be pushed. (Push to Partner)
  • Polling Interval: Interval at which the session is created and files are collected.
  • Archive File Interval (Days): Number of days the file should be archived upon deletion. The files are archived in the cloud storage as a non-current version and can be used for file replays.
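
As an illustration of the Pull-Push (client) flow above, a minimal sketch using the Apache Commons Net FTPClient (host, credentials, directories, and file pattern are placeholders; this is an example of the protocol interaction, not Data Gateway's implementation):

    import java.io.FileOutputStream;
    import java.io.OutputStream;
    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPClient;
    import org.apache.commons.net.ftp.FTPFile;

    public class FtpPullSketch {
        public static void main(String[] args) throws Exception {
            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.partner.example.com", 21);  // Remote Host / Remote Port
            ftp.login("gateway-user", "secret");         // Username / Password
            ftp.enterLocalPassiveMode();                 // Connection Type: Passive
            ftp.setFileType(FTP.BINARY_FILE_TYPE);       // Transfer Type: Binary

            String pickupDir = "/outbound";              // Pickup Directory
            for (FTPFile remote : ftp.listFiles(pickupDir)) {
                // Remote file pattern: only collect matching files.
                if (remote.isFile() && remote.getName().matches(".*\\.csv")) {
                    String remotePath = pickupDir + "/" + remote.getName();
                    try (OutputStream out = new FileOutputStream("/tmp/" + remote.getName())) {
                        ftp.retrieveFile(remotePath, out);
                    }
                    ftp.deleteFile(remotePath);          // Delete after collection
                }
            }
            ftp.logout();
            ftp.disconnect();
        }
    }
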
Push - Pull

This is where Data Gateway acts as the Server and allows External Parties to connect for File Exchanges. 

  • Username: Username with which the session is opened to Data Gateway.
  • Password: Password for authentication.
  • IP Range: Range of IP addresses to allow.
  • Rate Limit: Number of concurrent sessions that can be opened.
  • Virtual Root: Provide the path which acts as the first level of the directory path for the user.
  • Pickup Directory: Directory from which the files need to be pulled by the Partner. (Pull from Gateway)
  • Drop Directory: Directory to which the files need to be pushed. (Push to Gateway)
  • Permission: Select user folder permissions, such as READ DATA, WRITE DATA, APPEND DATA.
  • Archive File Interval (Days): Number of days the file should be archived upon deletion. The files are archived in the cloud storage as a non-current version and can be used for file replays.
Push-Push Scenario

This is where Data Gateway acts as the Server and allows External Parties to connect to receive file exchanges, and acts as the Client to send files back to the External Party.

This configuration would be the combination of the above two configurations (Push-Pull and Pull-Push). 

11.3.2 FTPS (FTP Secure)

FTPS (File Transfer Protocol Secure) is a communication protocol utilizing the client-server connection model, along with SSL certificates. In Data Gateway, it can act both as a Server (Push-Pull) and a Client (Pull-Push). The configuration can be done for all combinations of PULL and PUSH. The connection is performed from the Cloud to the External Parties in Data Gateway through a secure layer implemented in the Secure Component.

Pull - Push

This is where Data Gateway acts as the Client and connects to the Endpoint provided (a client-side sketch follows the field list below).

  • Connection Type: Select between FTPS Active and Passive modes, facilitating server-client data transfers.
  • Remote Host: Host name of the FTPS Server to which the connection is established.
  • Remote Port: Port number of the FTPS Server for the connection.
  • Transfer Type: Type of the transfer (Binary / ASCII).
  • Username: Username of the remote FTPS server.
  • Password: Password of the remote FTPS server.
  • No. Of Retries: Number of times to attempt connecting to the server.
  • Retry Interval: Time gap between attempts to access the server if the connection fails.
  • Remote file pattern: Regex pattern of the files to be pulled from the remote server.
  • Delete after collection: Checkbox indicating whether to delete the file from the remote FTPS server once it is pulled.
  • Pickup Directories: Path of the directories on the remote FTPS server from which the files need to be pulled. (Pull from Partner)
  • Drop Directories: Path of the directories on the remote FTPS server to which the files need to be pushed. (Push to Partner)
  • Polling Interval: Interval at which the session is created and files are collected.
  • Archive File Interval (Days): Number of days the file should be archived upon deletion. The files are archived in the cloud storage as a non-current version and can be used for file replays.
  • Upload Cert: Upload the remote FTPS server's certificate.
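
Analogous to the FTP example, a minimal Commons Net sketch for explicit FTPS, where the control and data channels are protected with TLS (placeholders as before; an illustration, not Data Gateway's implementation):

    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPSClient;

    public class FtpsPullSketch {
        public static void main(String[] args) throws Exception {
            FTPSClient ftps = new FTPSClient();            // explicit FTPS (AUTH TLS)
            ftps.connect("ftps.partner.example.com", 21);  // Remote Host / Remote Port
            ftps.login("gateway-user", "secret");          // Username / Password
            ftps.execPBSZ(0);                              // required before PROT
            ftps.execPROT("P");                            // encrypt the data channel
            ftps.enterLocalPassiveMode();                  // Connection Type: Passive
            ftps.setFileType(FTP.BINARY_FILE_TYPE);        // Transfer Type: Binary

            // File listing, retrieval, and deletion proceed exactly as in the FTP sketch.
            ftps.logout();
            ftps.disconnect();
        }
    }
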
Push - Pull:
  • Username: Username with which the session is opened to Data Gateway.
  • Password: Password for authentication.
  • IP Range: Range of IP addresses to whitelist.
  • Rate Limit: Number of concurrent sessions that can be opened.
  • Virtual Root: Mapping for the endpoint that serves as the root directory.
  • Pickup Directory: Directory from which the files need to be pulled by the Partner. (Pull from Gateway)
  • Drop Directory: Directory to which the files need to be pushed. (Push to Gateway)
  • Archive File Interval (Days): Number of days the file should be archived upon deletion. The files are archived in the cloud storage as a non-current version and can be used for file replays.
  • Permission: Select user folder permissions, such as READ DATA, WRITE DATA, APPEND DATA.
Push-Push Scenario

This is where Data Gateway acts as the Server and allows External Parties to connect to its FTPS Server to receive file exchanges, and acts as the Client to send files back by connecting to the External Party's FTPS Server.

This configuration would be the combination of the above two configurations (Push-Pull and Pull-Push). 

11.3.3 SFTP (SSH File Transfer Protocol)

SFTP (SSH File Transfer Protocol) is a widely used communication protocol utilizing the client-server connection model, involving a Private/Public Key authentication mechanism. In Data Gateway, it can act both as a Server (Push-Pull) and a Client (Pull-Push). The configuration can be done for all combinations of PULL and PUSH. The connection is performed from the Cloud to the External Parties in Data Gateway through a secure layer implemented in the Secure Component.

Pull - Push

This is where Data Gateway acts as the Client and connects to the Endpoint provided (a client-side sketch follows the field list below).

  • Remote Host: Host name of the SFTP Server to which the connection is established.
  • Remote Port: Port number of the SFTP Server for the connection.
  • Preferred MAC Algorithm: Select the MAC algorithm from the drop-down (HMAC-MD5, HMAC-SHA1, HMAC-SHA2-256). The MAC algorithm is used to ensure data integrity and authenticity during file transfer.
  • Remote User: Username of the remote SFTP server.
  • SSH Password: Password of the remote SFTP server.
  • Character Encoding: Ensures text data is correctly interpreted and displayed across different systems and platforms. Usually UTF-8 or UTF-16.
  • Connection Retry count: Number of times to attempt connecting to the server.
  • Retry Delay: Time gap between attempts to access the remote server if the connection fails.
  • Preferred cipher: The preferred cipher in SFTP specifies the encryption algorithm used to secure data during transfer. Select the preferred ciphers from the drop-down list.
  • Auth Type: Select the authentication type as Password or Public Key. In the case of Public Key, Data Gateway creates the key and shares the public part to the Endpoint email address.
  • Response timeout: Defines the duration the client waits for a response from the server before considering the connection unresponsive.
  • Remote file pattern: Regex pattern of the files to be pulled from the remote server.
  • Delete after collection: Checkbox indicating whether to delete the file from the remote SFTP server once it is pulled.
  • Pickup Directories: Path of the directories on the remote SFTP server from which the files need to be pulled. (Pull from Partner)
  • Drop Directories: Path of the directories on the remote SFTP server to which the files need to be pushed. (Push to Partner)
  • Polling Interval: Interval at which the session is created and files are collected.
  • Archive File Interval (Days): Number of days the file should be archived upon deletion.
  • Upload Cert: Upload the remote SFTP server's public key when the auth type is selected as Public Key.
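
As an illustration of the Pull-Push (client) flow above, a minimal sketch with the JSch SSH library (host, credentials, and paths are placeholders; an example of the protocol interaction, not Data Gateway's implementation):

    import com.jcraft.jsch.ChannelSftp;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;

    public class SftpPullSketch {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            // Auth Type: Public Key. For password auth, call session.setPassword(...) instead.
            jsch.addIdentity("/path/to/private_key");

            Session session = jsch.getSession("gateway-user", "sftp.partner.example.com", 22);
            session.setConfig("StrictHostKeyChecking", "no"); // demo only; verify host keys in production
            session.connect(30_000);                          // connection timeout (ms)

            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect();

            String pickupDir = "/outbound";                   // Pickup Directory
            for (Object entry : sftp.ls(pickupDir)) {
                ChannelSftp.LsEntry e = (ChannelSftp.LsEntry) entry;
                // Remote file pattern: collect only matching regular files.
                if (!e.getAttrs().isDir() && e.getFilename().matches(".*\\.csv")) {
                    String remotePath = pickupDir + "/" + e.getFilename();
                    sftp.get(remotePath, "/tmp/" + e.getFilename());
                    sftp.rm(remotePath);                      // Delete after collection
                }
            }
            sftp.disconnect();
            session.disconnect();
        }
    }
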
Push - Pull:
  • Username: User ID used to connect to the SFTP server.
  • Password: Password used to connect to the SFTP server.
  • Authentication Type: Select the auth type as Password or Public Key. If it is Public Key, the authentication key will be sent by email.
  • IP Range: Range of IP addresses to allow.
  • Rate Limit: Number of concurrent sessions that can be opened.
  • Virtual Root: Mapping for the endpoint that serves as the root directory.
  • Delete after Collection: Option to delete the file after collecting it from the connected server.
  • Pickup Directory: Directory from which the files need to be pulled by the Partner. (Pull from Gateway)
  • Drop Directory: Directory to which the files need to be pushed. (Push to Gateway)
  • Archive File Interval (Days): Number of days the file should be archived upon deletion.
  • Permission: Select user folder permissions, such as READ DATA, WRITE DATA, APPEND DATA.
Push-Push Scenario

This is where Data Gateway acts as the Server and allows External Parties to connect to its SFTP Server to receive file exchanges, and acts as the Client to send files back by connecting to the External Party's SFTP Server.

This configuration would be the combination of the above two configurations (Push-Pull and Pull-Push). 
