Data Safety
How secure is your data?
For several years, we have worked to secure the data our customers entrust to us, and we strengthen our protection measures on an ongoing basis.
Infrastructure
Databases and inter-server traffic are isolated on a dedicated private network, ensuring strict separation and protection of sensitive information. Software components are updated regularly to close potential vulnerabilities, and administrator access is secured with SSH certificates, which strengthens protection against unauthorized access.
Logging and active monitoring are carried out through a SIEM solution, allowing rapid detection of anomalies and intrusion attempts. In parallel, firewalls and anti-DDoS protections are deployed to prevent system overload and malicious access.
To secure privileged access, bastion hosts and privileged access management (PAM) solutions are used, ensuring strict control over users with access to sensitive functions. In addition, all exchanges between servers are encrypted, guaranteeing the confidentiality and integrity of the data in transit. Finally, the use of secure protocols such as TLS for all communications strengthens protection against interception and man-in-the-middle attacks.
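As an illustration of the transport-layer protections described above, the following minimal Python sketch restricts a client-side connection to verified certificates and recent TLS versions; the hostname is a placeholder, not one of our actual endpoints.

    import socket
    import ssl

    # Minimal sketch: open a TLS-protected connection to an internal service.
    # "internal-api.example.local" is a placeholder hostname used for illustration.
    context = ssl.create_default_context()             # verifies server certificates by default
    context.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse outdated protocol versions

    with socket.create_connection(("internal-api.example.local", 443)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname="internal-api.example.local") as tls_sock:
            print(tls_sock.version())                   # e.g. "TLSv1.3"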
These combined measures ensure robust and continuous defense against threats, thus preserving the integrity and security of systems and data.
Applications
The security of exchanges and data is a key priority in our infrastructure. Connections between the different systems are secured end-to-end with TLS, guaranteeing the confidentiality of the information exchanged and protecting it against malicious interception. To strengthen access protection, passwords are generated with more than 50 bits of entropy and stored only as a salted one-way hash, making them impractical to recover even if the database were compromised.
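As a rough illustration of this approach (not our exact implementation), the sketch below generates a password with more than 50 bits of entropy and stores it only as a salted scrypt hash, using the Python standard library; the helper names are chosen for the example.

    import hashlib
    import math
    import os
    import secrets
    import string

    ALPHABET = string.ascii_letters + string.digits  # 62 symbols

    def generate_password(min_entropy_bits: float = 50.0) -> str:
        """Generate a random password whose entropy exceeds the given threshold."""
        bits_per_char = math.log2(len(ALPHABET))               # ~5.95 bits per character
        length = math.ceil(min_entropy_bits / bits_per_char)   # 9 chars -> ~53.6 bits
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    def hash_password(password: str) -> tuple[bytes, bytes]:
        """One-way hash with a per-password salt (scrypt, memory-hard)."""
        salt = os.urandom(16)
        digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
        return salt, digest

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
        return secrets.compare_digest(candidate, digest)

    pwd = generate_password()
    salt, digest = hash_password(pwd)
    assert verify_password(pwd, salt, digest)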
Access control to resources is strictly regulated: each user has only the permissions necessary for their role, limiting the risk of abuse or human error. To protect against common attacks such as SQL injection or cross-site scripting (XSS), rigorous input filtering is applied so that only valid data is processed.
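The sketch below illustrates the kind of defenses meant here, using an in-memory SQLite table and hypothetical helper functions: a whitelist filter on incoming values, a parameterized SQL query, and HTML escaping of anything echoed back to the browser.

    import html
    import re
    import sqlite3

    # In-memory database used only for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO users (name) VALUES ('alice')")

    USERNAME_RE = re.compile(r"^[a-z0-9_-]{3,32}$")   # whitelist filter on the input

    def find_user(raw_name: str):
        if not USERNAME_RE.fullmatch(raw_name):
            raise ValueError("invalid username")       # reject anything unexpected
        # Parameterized query: user input is never concatenated into the SQL string.
        return conn.execute("SELECT id, name FROM users WHERE name = ?", (raw_name,)).fetchone()

    def render_greeting(raw_name: str) -> str:
        # Escape before inserting into HTML to neutralize XSS payloads.
        return f"<p>Hello, {html.escape(raw_name)}!</p>"

    print(find_user("alice"))
    print(render_greeting("<script>alert(1)</script>"))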
To prevent cross-site request forgery (CSRF), security tokens are used to ensure that each request is legitimate and issued by an authorized user. The deployment process is fully automated, eliminating the risks associated with direct developer access to production servers and ensuring that updates are applied in a controlled way.
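A minimal sketch of such a CSRF token, assuming a per-session token kept server-side (here a plain dict stands in for the real session store):

    import hmac
    import secrets

    _sessions: dict[str, str] = {}            # illustrative in-memory session store

    def issue_csrf_token(session_id: str) -> str:
        token = secrets.token_urlsafe(32)     # unpredictable per-session secret
        _sessions[session_id] = token
        return token                          # embedded in the form sent to the client

    def validate_csrf_token(session_id: str, submitted_token: str) -> bool:
        expected = _sessions.get(session_id, "")
        # Constant-time comparison avoids leaking information via timing.
        return hmac.compare_digest(expected, submitted_token)

    sid = "session-123"
    tok = issue_csrf_token(sid)
    assert validate_csrf_token(sid, tok)
    assert not validate_csrf_token(sid, "forged-token")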
Finally, web application firewalls (WAF) provide additional protection against threats coming from the Internet, blocking unauthorized access attempts and strengthening the security of the entire infrastructure. By combining these practices and technologies, we offer a hardened environment that is resistant to advanced threats.
Multi-Factor Authentication (MFA)
Incorporating an additional layer of security by requiring a second verification (e.g. a code sent via SMS or generated by an authenticator app) when accessing sensitive resources.
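For illustration, the following sketch computes a time-based one-time password (TOTP, RFC 6238) of the kind produced by authenticator apps; the shared secret shown is a well-known example value, not a real credential.

    import base64
    import hmac
    import struct
    import time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
        """Time-based one-time password (RFC 6238), as used by authenticator apps."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval
        msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
        digest = hmac.new(key, msg, "sha1").digest()
        offset = digest[-1] & 0x0F                             # dynamic truncation (RFC 4226)
        code = (int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF) % (10 ** digits)
        return str(code).zfill(digits)

    # Example shared secret enrolled in the user's authenticator app.
    SECRET = "JBSWY3DPEHPK3PXP"
    print("current code:", totp(SECRET))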
Encryption of sensitive data
In addition to encrypted connections, we are considering encrypting stored sensitive data (e.g. personal data, banking information) using strong algorithms such as AES-256.
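A minimal sketch of what such storage encryption could look like with AES-256 in GCM mode, assuming the third-party Python cryptography package; the record contents and helper names are illustrative only.

    # Requires the third-party "cryptography" package (pip install cryptography).
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_record(key: bytes, plaintext: bytes, aad: bytes = b"") -> bytes:
        nonce = os.urandom(12)                          # unique nonce per encryption
        return nonce + AESGCM(key).encrypt(nonce, plaintext, aad)

    def decrypt_record(key: bytes, blob: bytes, aad: bytes = b"") -> bytes:
        nonce, ciphertext = blob[:12], blob[12:]
        return AESGCM(key).decrypt(nonce, ciphertext, aad)

    key = AESGCM.generate_key(bit_length=256)           # 256-bit key => AES-256
    blob = encrypt_record(key, b"IBAN FR76 ...", aad=b"customer:42")
    assert decrypt_record(key, blob, aad=b"customer:42") == b"IBAN FR76 ..."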
Audit and monitoring of access
Implementing detailed logging systems to track all access to critical resources, with regular review of logs to identify any suspicious activity.
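As an illustration only, the sketch below emits one structured JSON line per access decision so that logs can be reviewed or fed into a SIEM; the field names and log file path are assumptions made for the example.

    import json
    import logging
    from datetime import datetime, timezone

    access_logger = logging.getLogger("access-audit")
    handler = logging.FileHandler("access-audit.log")   # illustrative destination
    handler.setFormatter(logging.Formatter("%(message)s"))
    access_logger.addHandler(handler)
    access_logger.setLevel(logging.INFO)

    def log_access(user: str, resource: str, action: str, allowed: bool) -> None:
        access_logger.info(json.dumps({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "resource": resource,
            "action": action,
            "allowed": allowed,
        }))

    log_access("alice", "/billing/export", "read", True)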
Setting up key management
Using a key management solution (e.g., HashiCorp Vault or AWS KMS) to centralize management and rotation of encryption keys.
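For example, with AWS KMS the usual pattern is envelope encryption: a data key is generated by KMS, used locally, and only its encrypted form is stored alongside the data. The sketch below assumes the boto3 package, valid AWS credentials, and a KMS key behind the placeholder alias "alias/app-data".

    # Requires boto3 and an existing KMS key; "alias/app-data" is a placeholder alias.
    import boto3

    kms = boto3.client("kms")

    def new_data_key():
        """Ask KMS for a fresh data key: plaintext for local use, ciphertext for storage."""
        resp = kms.generate_data_key(KeyId="alias/app-data", KeySpec="AES_256")
        return resp["Plaintext"], resp["CiphertextBlob"]

    def recover_data_key(ciphertext_blob: bytes) -> bytes:
        """Ask KMS to unwrap a stored (encrypted) data key when the data must be read."""
        return kms.decrypt(CiphertextBlob=ciphertext_blob)["Plaintext"]

With this pattern the master key never leaves KMS, so rotation and access control remain centralized.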
Regular penetration testing
Scheduling periodic penetration tests to identify and correct potential vulnerabilities in the system before they are exploited.
Securing APIs
Adding strong authentication mechanisms for APIs, such as OAuth2 with time-limited access tokens.
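As one possible illustration of a time-limited token, the sketch below issues and checks a signed JWT with a 15-minute expiry, assuming the third-party PyJWT package; the secret and claim values are placeholders.

    # Requires the third-party PyJWT package (pip install PyJWT).
    from datetime import datetime, timedelta, timezone
    import jwt

    SECRET = "replace-with-a-key-from-the-key-manager"   # placeholder signing key

    def issue_access_token(client_id: str, scope: str, ttl_minutes: int = 15) -> str:
        now = datetime.now(timezone.utc)
        claims = {
            "sub": client_id,
            "scope": scope,
            "iat": now,
            "exp": now + timedelta(minutes=ttl_minutes),   # short-lived token
        }
        return jwt.encode(claims, SECRET, algorithm="HS256")

    def check_access_token(token: str) -> dict:
        # Raises jwt.ExpiredSignatureError once the token has expired.
        return jwt.decode(token, SECRET, algorithms=["HS256"])

    token = issue_access_token("mobile-app", "invoices:read")
    print(check_access_token(token)["scope"])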
Protection against DDoS attacks
We are considering integrating services to protect against distributed denial of service (DDoS) attacks, for example using solutions like Cloudflare or AWS Shield.
Regular update of dependencies
Automating updates of the libraries and frameworks we use to avoid vulnerabilities linked to obsolete versions.
Execution in isolated environments (containers)
Deploying applications in isolated environments such as Docker containers or Kubernetes pods, limiting container permissions and enforcing strict security policies.
File Integrity Checks
Using integrity verification mechanisms, such as checksums or digital signatures, to ensure that the deployed code is the intended one, free of malicious modification.
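A minimal checksum-based variant of this idea, using only the Python standard library; the helper names are ours, and the expected checksum would come from the build pipeline.

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Compute the SHA-256 checksum of a file, reading it in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as fh:
            for chunk in iter(lambda: fh.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_artifact(path: Path, published_checksum: str) -> bool:
        """Check a deployed artifact against the checksum recorded at build time."""
        return sha256_of(path) == published_checksum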
Procedures and Disaster Recovery Plan (DRP)
To ensure the resilience and continuity of our services, we perform frequent backups, ideally hourly, stored across multiple sites. This multi-site approach protects data from loss in the event of a disaster. We have also adopted geographic redundancy, with active/active infrastructure, to maintain service continuity even if a site or an entire region fails.
In order to test the effectiveness of our security and recovery plans, tests are carried out regularly at random intervals. These simulations involve all teams, ensuring that everyone is ready to react appropriately in the event of an incident. In parallel, strict access control is enforced through the RBAC (Role-Based Access Control) model, with regular audits of rights to ensure that no unauthorized person has access to sensitive resources.
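As a simplified illustration of the RBAC model mentioned above (not our actual permission scheme), a user is granted an action only if one of their roles carries the corresponding permission:

    # Minimal RBAC sketch; role and permission names are illustrative.
    ROLE_PERMISSIONS = {
        "support": {"tickets:read", "tickets:write"},
        "billing": {"invoices:read"},
        "admin":   {"tickets:read", "tickets:write", "invoices:read", "users:manage"},
    }

    USER_ROLES = {
        "alice": {"support"},
        "bob":   {"billing"},
    }

    def is_allowed(user: str, permission: str) -> bool:
        """A user is allowed an action only if one of their roles grants it."""
        roles = USER_ROLES.get(user, set())
        return any(permission in ROLE_PERMISSIONS.get(role, set()) for role in roles)

    assert is_allowed("alice", "tickets:write")
    assert not is_allowed("bob", "users:manage")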
Technical support receives ongoing training to maintain a high level of vigilance against security risks, supplemented by simulated phishing campaigns that keep teams alert to such attempts. We also favor certified service providers and carry out regular audits to verify their compliance with the strictest security standards.
Backup processes are automated to avoid human error and ensure optimal backup frequency. In the event of a crisis, an emergency communication plan is deployed, allowing for a rapid and coordinated response. Finally, cyber attack simulations are organized to test the resilience of our infrastructure against potential threats and prepare teams to respond effectively in real situations.
These practices, combined with rigorous and proactive risk management, ensure enhanced security and maximum resilience of our systems.
GDPR in business
Who is responsible?
The head of the company or the general manager is responsible for the data processed.
Consent
Companies have a public interest mission and are required to facilitate the exercise of rights related to personal data. We therefore recommend presenting the intended uses of the data at the kick-off meeting of each project or contract, and providing a document template to each subcontractor involved, in order to obtain the consent of all parties to the use of the data. A template adapted to your situation is available in the section Our missions.
Duty to inform
When processing personal data for a customer, it is essential to inform the data subjects of all processing carried out and to provide them with an understandable description of it.
Record keeping
Data controllers must maintain a register detailing all processing operations carried out, their purposes, the means deployed, the subcontractors involved, and so on. Data protection impact assessments must also be carried out to evaluate the potential consequences of a data breach.
Choice of providers
To carry out their missions, companies call on external service providers or engineers specializing in IT security, who act either as subcontractors or as direct employees of the company. It is the responsibility of the data controller to ensure their compliance with the GDPR.