AI Model and Data Use Notice
Effective Date: January 1, 2025
Last Updated: March 2026
Applies To: All Neptune Forge, Neptune Odyssey, and AutoCounsel deployments
Contact: privacy@neptunetechnologies.com
1. Overview
Neptune Technologies Inc. is committed to transparency about how AI models are deployed, how customer data is handled, and under what circumstances data may be used for model training or fine-tuning.
Neptune operates as a hybrid AI platform. Depending on your subscription, configuration, and contractual agreements, your deployment may use one or more of three distinct service modes. This Notice explains each mode, what it means for your data, and what protections apply.
Neptune's default operation is to deploy and integrate AI models — not to train them. Custom training and fine-tuning are optional, client-initiated services performed only under a separate written agreement. Neptune does not train on customer data by default.
2. Neptune's Three Service Modes
Mode 1 — Third-Party Foundation Model Deployment
Neptune integrates with third-party AI providers (e.g., Anthropic Claude, OpenAI) accessed via API. Neptune configures and manages the integration; the underlying model is operated by the third-party provider.
Data flow: Prompts and data submitted in this mode are transmitted to the third-party provider's infrastructure for processing. Neptune does not control how the provider stores or processes this data.
Training: Neptune does not train or fine-tune any model in this mode. Third-party provider terms govern whether submitted data may be used for model improvement by the provider.
Your responsibility: Review the applicable upstream provider's data and privacy policies before submitting sensitive or personal data.
Mode 2 — Self-Hosted Open-Source Model Deployment
Neptune deploys an open-source AI model within a Neptune-managed or customer-managed secure infrastructure environment. No third-party model provider is involved.
Data flow: Data is processed entirely within the designated secure infrastructure. No data is transmitted to external model providers.
Training: Neptune does not train or fine-tune the model in this mode unless a separate fine-tuning agreement is in place (see Mode 3).
Your responsibility: Ensure data submitted to the deployment is appropriate for the infrastructure environment specified in your agreement.
Mode 3 — Custom Fine-Tuning Engagement
At customer request and under a separate written agreement, Neptune fine-tunes an open-source or foundation model using customer-provided data to create a custom model tailored to the customer's needs.
Data flow: Customer data is used solely for training the customer's designated model in a logically isolated environment. Data is not shared with or accessible by any other customer.
Training: Fine-tuning is performed only with explicit written authorization. The customer retains ownership of their training data and the resulting fine-tuned model.
Your responsibility: Confirm you hold legal rights to use submitted data for model training. Execute the required Data Use Agreement prior to commencement.
3. Data Governance by Mode
3.1 Third-Party Model Deployments (Mode 1)
Neptune will disclose to you which third-party model providers are in use for your deployment. We recommend reviewing the following before submitting sensitive data:
Anthropic's usage policies and privacy terms (for Claude deployments)
OpenAI's usage policies and privacy terms (for OpenAI deployments)
The terms of any other third-party provider applicable to your configuration
Neptune implements controls at the integration layer (e.g., prompt filtering, access controls, logging) but does not control the model provider's infrastructure or data retention practices.
3.2 Self-Hosted Deployments (Mode 2)
For self-hosted deployments, Neptune maintains:
Logical isolation between customer environments — no customer's data or model outputs are accessible to any other customer
Administrative, technical, and physical safeguards appropriate to the deployment environment
Access controls and audit logging for all data handling activities
Strict confidentiality obligations on all personnel with access to customer data
3.3 Fine-Tuning Engagements (Mode 3)
The following requirements apply to all fine-tuning engagements:
A separate written Data Use Agreement must be executed prior to any data being submitted for training
The customer must confirm in writing that it holds the legal rights to use the submitted data for AI training purposes
Training is performed in a dedicated, logically isolated environment. No training data or resulting model is shared with, accessible to, or used to benefit any other customer
All data is de-identified before training unless training on identifiable data is necessary and explicitly authorized in the Data Use Agreement
The resulting fine-tuned model is the customer's asset. Neptune will not use it for any other purpose or customer without explicit written authorization
Upon termination, customers may request export or secure deletion of their fine-tuned model and training data
4. Training Data Transparency
Neptune publishes the following information about datasets used in Neptune-operated or Neptune-developed model components, consistent with applicable laws including California AB 2013 (Generative AI Training Data Transparency Act, effective January 1, 2026).
Third-Party Models: Neptune does not control or have visibility into the training datasets used by third-party providers (Anthropic, OpenAI, etc.). Refer to each provider's published model cards and data transparency documentation.
Open-Source Base Models: Neptune deploys open-source models developed and released by third parties (e.g., Meta Llama, Mistral). Training data for these models is documented by their respective developers. Neptune does not modify base model training data.
Customer Fine-Tuning Data: This data is provided entirely by the customer under a Data Use Agreement. Neptune does not select, source, or own it. Customers are responsible for ensuring the dataset's provenance, IP clearance, and privacy compliance.
5. Privacy and Security
Across all service modes, Neptune maintains the following baseline privacy and security practices:
All data is handled in accordance with Neptune's Privacy Policy and applicable regulations including GDPR, CCPA, and LGPD
Neptune does not sell customer data or use it for purposes beyond delivering the Services
Neptune implements access controls, encryption at rest and in transit, logging, and audit trails
All personnel with access to customer data are subject to confidentiality obligations and security training
Neptune acts as a data processor for enterprise customers and as a data controller for self-service users
For fine-tuning engagements, additional safeguards apply as described in Section 3.3. Neptune will notify affected customers of any security incident involving their data in accordance with applicable breach notification laws.
6. Safety and Responsible AI
Neptune incorporates responsible AI practices across all deployment modes:
Model outputs are subject to safety evaluation and content filtering where technically feasible and appropriate to the deployment
Neptune prohibits use of training data that introduces discriminatory, unlawful, or harmful model behavior
Model behavior is tested against safety benchmarks prior to deployment in Neptune-managed environments
Neptune maintains an incident escalation process for safety-relevant model outputs. Customers may report safety concerns to privacy@neptunetechnologies.com
Neptune aligns its responsible AI practices with recognized frameworks including the NIST AI Risk Management Framework (AI RMF) and applicable provisions of the EU AI Act
For third-party model deployments, safety behaviors are also governed by the upstream provider's safety policies and filtering infrastructure. Neptune does not override or circumvent upstream safety controls.
7. Your Rights and How to Exercise Them
Depending on your jurisdiction, you may have the following rights regarding your personal data:
Access — available under GDPR, CCPA, LGPD. Contact privacy@neptunetechnologies.com.
Deletion — available under GDPR, CCPA, LGPD. Contact privacy@neptunetechnologies.com.
Portability — available under GDPR, EU Data Act, CCPA. Request a machine-readable export via privacy@neptunetechnologies.com.
Opt-Out of Training — applies to all users. No action required. Neptune does not use your data for training without a separate written agreement.
Correction — available under GDPR, LGPD. Contact privacy@neptunetechnologies.com.
Neptune will respond to verified requests within 30 days (or the timeframe required by applicable law). For requests relating to data processed by third-party model providers, Neptune will assist where possible but may direct you to the relevant provider for data within their control.
8. Updates to This Notice
Neptune will notify affected customers of material changes to this Notice at least 30 days before changes take effect, via email or account notification. Continued use of the Services after that period constitutes acceptance of the updated Notice.
Where changes affect how customer data is used for training or fine-tuning, Neptune will obtain fresh written consent before implementing those changes.
9. Contact
Neptune Technologies Inc.
privacy@neptunetechnologies.com
For formal legal notices, send written correspondence marked "ATTN: Legal Department."
This Notice supplements Neptune's Terms of Service and Privacy Policy. In the event of conflict, the terms most protective of customer data apply.
