September 2019 Pirum Limited
Ansible-based deployment engine
Designed and implemented a multiplatform deployment engine, based on Ansible, for building and provisioning physical and virtual machines, both in the cloud and on premises.
- Perfect compliance with infrastructure as code principles.
- Non-interactive Debian deployment from scratch, on both physical and virtual hardware.
- Virtual machine back ends for Proxmox and Linode.
- Integration with ansible-lint.
- Automatic documentation publishing from git.
- YAML schema validation.
- Compatibility with Microsoft Windows.
- Automatic publishing of provisioning reports to Samba or Google Drive, or delivery by email.
- Integration with Jenkins Continuous Integration.
The engine enforced a strong separation of concerns, using separate repositories for the infrastructure definition, the engine itself, and a large collection of roles.
Strong isolation: each role was written as an independent entity, with installation, uninstallation, testing and reporting behaviours.
Using the engine described above, I created Ansible roles to deploy core data centre modules:
- Redundant external firewalls with connection tracking synchronisation.
- Redundant stateless internal firewalls.
- Transparent HTTP and HTTPS proxy without HTTPS inspection.
- Dynamic routing for both internal and external firewalls.
- Dynamic IP address whitelisting synchronised with custom in-house applications.
- Intelligent email interception at the firewall level, to avoid data leaks during the deployment and testing phases.
- Migrated DNS servers to modern architecture.
The external and internal firewalls successfully passed a penetration test and a security audit, with only minor remarks.
Cloud based remote working solution
Using the same engine, we deployed an integrated solution for remote workers on the Linode cloud provider. The solution included:
- Multiple redundant VPN servers.
- Certificate authentication and revocation list verification.
- Filtering proxy with authentication linked to the on-premises Active Directory.
- Intrusion detection systems with a very low false-positive rate.
- Two-factor authentication using Google Authenticator-compatible clients.
- Platform verification using custom scripts.
- Automatic connection for trusted hardware.
- Custom DNS server to dispatch internal and external queries.
- Continuous integration for certificate generation and revocation.
- Automatic email generation, with QR code.
- Full integration with git, managing the whole process through pull requests.
- Prototype for integration with Cisco Umbrella.
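The two-factor scheme above used codes compatible with Google Authenticator, i.e. time-based one-time passwords (RFC 6238). The original server code is not reproduced here; the following is a minimal stdlib-only sketch of how such a code is derived from a shared base32 secret:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Compute an RFC 6238 TOTP code from a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, "sha1").digest()
    # Dynamic truncation (RFC 4226): the low nibble of the last byte
    # selects a 4-byte window in the HMAC output.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

A server verifies a submitted code by computing it for the current time step (and usually the adjacent steps, to tolerate clock drift) and comparing.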
Using the Samba servers for authentication and single sign-on, I built a redundant system to share files both in the office and over the VPN.
- Internal file share server.
- Migration from Puppet to Ansible.
- Windows provisioning.
December 2016 – June 2019: SeeQuestor Limited
Senior system administrator
Hardware and Software Installation
SeeQuestor provides British and international police forces with a complete platform for video conversion, ingestion and analysis. The platform includes motion detection, face recognition, re-identification, etc., and runs on nVidia GPU hardware with the CUDA stack.
I designed and implemented the entire hardware and operating system stack necessary to provide these services, starting from the prototype already in place.
The platform also includes Windows-based virtual and physical machines, for proprietary format analysis and screen scraping.
Its access is secured through two isolated virtual private networks, for administrators and customers, and it is undetectable externally.
Amazon Web Services hosting
I used the same methodologies to deploy the platform on AWS, with few or no modifications. This time the platform sat behind a load balancer and was publicly accessible, so various strategies were employed to secure it.
Cyber Essentials Plus certification
To work with national crime agencies in the UK, SeeQuestor needed strong security credentials, starting with the Cyber Essentials and Cyber Essentials Plus certifications.
I designed and deployed the network infrastructure to optimise the working environment and comply with Cyber Essentials.
I detailed the practices for developers, managers and directors to follow in order to pass the security audit.
I conducted basic penetration tests ahead of the official audit.
Platform encryption prototype
I implemented platform self-decryption on boot, to protect intellectual property and prevent unauthorised access. The platform is protected by a randomly generated OATH code created offline, with a mandatory online validation step.
March 2015 – November 2016: Bulb Software Limited
DevOps administrator and back-end developer
Hardware and Software Installation
Bulbthings developed an asset management application to handle asset life cycles efficiently and reduce costs. It supported any type of asset: cars, printers, phones, etc.
The application targeted companies of any size, from small and medium sized companies to corporate with international branches.
The initial application was architected on top of a MySQL database, a PHP/Symfony back end used as a REST server, and an AngularJS/Metronic front end. The back end was complemented by other programs, written in Go, for HTML5 WebSockets.
I took part in rewriting the whole application stack, with robust continuous integration techniques and bottom-up vertical development, using modern technologies and best practices. For the database we used PostgreSQL, a hybrid relational database that allowed querying hierarchical data trees, dynamic schema updates, and geolocation with distance calculations.
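As an illustration of the hierarchical queries mentioned above: PostgreSQL supports recursive common table expressions (`WITH RECURSIVE`). The sketch below uses SQLite, which shares the same syntax, so it stays stdlib-only; the schema is invented for the example and is not Bulbthings' actual data model:

```python
import sqlite3

# A self-referencing asset tree, walked with a recursive CTE.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE asset (id INTEGER PRIMARY KEY, parent_id INTEGER, name TEXT);
    INSERT INTO asset VALUES
        (1, NULL, 'Fleet'),
        (2, 1,    'Car A'),
        (3, 1,    'Car B'),
        (4, 2,    'Tyre set');
""")
rows = conn.execute("""
    WITH RECURSIVE subtree(id, name, depth) AS (
        SELECT id, name, 0 FROM asset WHERE id = 1
        UNION ALL
        SELECT a.id, a.name, s.depth + 1
        FROM asset a JOIN subtree s ON a.parent_id = s.id
    )
    SELECT name, depth FROM subtree ORDER BY depth, id
""").fetchall()
# rows holds the whole subtree under 'Fleet', with its depth in the tree.
```

The same query runs unchanged on PostgreSQL, which additionally offers JSONB columns for the dynamic schema updates and PostGIS for geolocation and distance calculations.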
The whole stack was thoroughly checked using a continuous integration system, at many levels:
- For the back-end: unit tests and integration tests with coverage analysis.
- Asynchronous testing of the Rest API on each commit.
- For the front-end: automated acceptance tests using two real web browsers and automatic screenshots.
Google Compute Engine
The instances were automatically deployed by Jenkins on GCE. We first used a collection of Bash scripts, later migrated to Ansible.
Each instance was automatically deployed on a sub-domain and instantly accessible, without having to wait for DNS propagation.
We used a wildcard certificate for simple HTTPS support, as well as HTTP/2 to reduce the load.
We also used four front ends to distribute the load via DNS.
Python Import framework
To import customer data, I wrote a Python framework that automatically parsed CSV files and used the back-end API.
- Validation of data consistency.
- Error handling.
- Automatic error reporting by email.
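The original framework is proprietary, but its shape can be sketched with the standard library: parse the CSV, validate each row, push valid rows to the back end, and collect errors for later reporting. `post`, `import_rows` and the field names are illustrative placeholders, not the real API:

```python
import csv
import io

def import_rows(csv_text, post, required=("name", "email")):
    """Parse CSV text, validate each row, send valid rows via `post`,
    and collect per-line errors for later reporting (e.g. by email)."""
    errors, imported = [], 0
    # Data rows start at physical line 2 (line 1 is the header).
    for lineno, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        missing = [f for f in required if not (row.get(f) or "").strip()]
        if missing:
            errors.append(f"line {lineno}: missing {', '.join(missing)}")
            continue
        try:
            post(row)          # stand-in for the back-end API call
            imported += 1
        except Exception as exc:   # API failures are collected, not fatal
            errors.append(f"line {lineno}: {exc}")
    return imported, errors
```

A driver script would then email the `errors` list when it is non-empty, so a bad row never silently disappears from an import.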
September 2011 – March 2015: Indiefield
Developer / System administrator
Modernisation of legacy environments
After working in increasingly Microsoft-oriented environments, I chose this position as a return to my open-source roots. The company had a real desire to move away from Microsoft technologies towards solutions with less vendor lock-in, and my experience in both domains was a decisive criterion for this role. I managed to migrate all the services to Ubuntu- and Debian-based environments.
It was a challenging position, with a huge workload. The environment itself was built upon a ten-year-old structure, using mixed open-source and proprietary technologies.
All these applications have been gradually replaced by modern LAMP stacks using agile methodologies, continuous integration and unit testing.
I first installed a replicated, shared authentication system, so everyone had to remember only one password. I then replaced Active Directory with a Samba server and activated roaming profiles.
The Exchange server was replaced by three email servers hosted on two different internet connections. Emails were digitally signed and certified. A dual internal/external Jabber server was also included.
I have also deployed solutions allowing remote workers to transparently access all the services from home, like a VPN and a private cloud.
Finally, I replaced the in-house-developed intranet with a secure and modern wiki.
Consumer directory migration
The “Consumer Directory” is a web site containing a large database of consumers, who can register themselves. Once registered, they receive invitations to take part in surveys, after completing screeners.
I used Drupal as a content management framework to build a feature-rich, modern version, and transparently migrated all the members. All their details were validated and reformatted, and their initial passwords kept.
One of the prominent features was geolocation, allowing simpler searches than by postcode alone.
The system was customised for mass messaging and automatic bounce management. All outgoing emails were signed with DKIM and covered by SPF records, reducing false classification as spam.
Finally, all this stack was checked using continuous integration tests in a real web browser.
Online recruiters and fieldworkers payment system
The company was initially using a Microsoft Access database to store their recruiters details, and was using paper and posts to communicate invoices and payment.
Using the same CMF, I have created an online system, imported the fieldworkers database, and added some useful features:
- Fieldworkers were able to update their details and upload their ID themselves.
- Emails alerts were automatically sent as soon as new jobs were posted.
- Project managers were able to search fieldworkers using various criteria, like domains of expertise, geolocation or personal data.
- Recruiters were able to submit their invoice directly to the system.
- The invoicing process was integrated into a workflow, involving project managers and finance department.
- Each action was logged and emails were sent on specific events.
- Project managers were able to rate recruiters or log misbehaviour. This was then accessible by the rest of the team.
- Statistics and financial reports by month, recruiter, project, etc., with email alerts when budgets were exceeded.
The authentication system was transparently integrated with our internal directory server, allowing the project managers to authenticate with their normal password.
Main company web site
To build the new web site, I have used the same framework, but this time as a Content Management System. With a customised responsive theme for mobile devices, the same information was presented differently on phones, tablets and screens.
Using a CMS allowed some departments to add content themselves, which was then validated and aggregated to create new pages. The web site was connected with an analytic system, with weekly and monthly reports sent by email to client services and directors.
I have also integrated of a real time chat system, connected to the client services department.
Some custom modules have been developed, for instance to handle mailing lists unsubscriptions from clients.
Interviews recording system
Phone interviews from the call centre were manually archived and processed by an operator every day. I replaced this with a set of thoroughly tested Perl scripts, which was faster and less error-prone than manual processing.
The scripts were able to combine related records together and encode them in different formats, according to each client's preferences.
The system was also able to upload records to clients' web sites via FTP, or to archive them to Blu-ray discs at the end of each month.
Finally, an email was generated every night with a summary of events.
Vehicles Registration Research
This web site was developed to query the UK's national vehicle registration services and obtain vehicle details from a plate number: make, model, engine power, etc.
February 2008 – September 2011: Red2 Limited
Senior developer and system administrator
Working environment foundations
My first task at Red2 was to build the foundations for a proper working environment. I minimised single points of failure by using replication for the most important services.
I started by installing two hypervisors for Windows and Linux guests, with disk snapshots and archiving.
I have also installed a VPN server, with two DNS servers and custom domain names for internal development and continuous integration.
An entirely open-source email server was built to deliver advanced features: shared and public folders, fine-grained ACLs, and an extremely powerful mail filtering system (Sieve).
The system was extended with a shared calendar and address book, using well-established standards (CalDAV and CardDAV).
The webmail offered access to all settings, such as folder ACLs and the anti-spam policy rules.
I also installed a powerful and extensible wiki and mash-up platform for our internal procedures and documentation.
Finally, I installed two Windows domain controllers, with roaming profiles stored on a shared NAS.
Authentication to all these services used redundant credential-management servers supporting multiple protocols.
Continuous integration environment
The second step was to build a proper continuous integration environment, in accordance with agile methodologies, using the technologies detailed below:
- Source control servers with replication and scheduled backups, linked to a modern ticket-tracking system integrated into Visual Studio.
- Continuous integration server, with a build pipeline, source control polling and email notifications on major build events.
- Deployment on custom domain names for each stage of development (latest builds, IAT, UAT, staging, etc.).
- Acceptance test server, with scheduled tests and email notifications. Test cases could be written in several languages and drive all major browsers on Windows and Linux.
Hewitt auction project
For this project, I worked with the team on the design of an extensible auction platform, using distributed web services.
We used agile methodologies to create user stories from the requirements, and continuously adapted the stories to the timelines.
We also used modern approaches and techniques in our development. For instance, both the validation rules and the data models were transparently shared between the web services and the user interface projects, which drastically reduced development time and simplified our development model.
Salvage Direct auction project
Salvage Direct was a salvage automobile auction site that enabled registered and verified users to bid on and sell salvaged vehicles.
I added Comet technology to push bid events to browsers in real time.
I have also developed a separate image server to host all photos from the auction web sites, with advanced features:
- Automatic thumbnail creation on upload.
- Automatic decompression of zip files on upload.
- Graceful handling of non-existent images.
- Secured FTP access for customers.
- Simple REST API to list images.
The administration interface allowed staff to monitor the transactions and communicate with buyers in real time.
I made intensive use of both Ajax and Comet technologies, pushed to their limits.
I have also deployed a load balancer in front of the web servers, to dispatch the requests.
National Museums Social Networking Project
The Creative Space project is a social networking application that allows users to search across nine museum and gallery collections.
I implemented its distributed architecture on nine heterogeneous servers, allowing real-time communication.
The whole system is built on top of Drupal, with both standard and custom modules, specifically developed for the project.
Each instance uses remote procedure calls for exchanging and synchronising content, with automatic detection of offline instances, and off-line replication.
The content exchanged by the instances is encrypted using latest standards for data confidentiality.
User accounts are replicated and synchronised in real time on all instances, with single sign-on implemented so that logged-in users can navigate transparently across the whole system.
Web crawling system for Dun & Bradstreet UK
Designed and implemented a custom multi-threaded crawler engine to extract contact information for financial companies. Featuring:
- Extraction and automatic validation of emails, phone numbers (UK and international rules) and UK postal addresses.
- Anonymous proxies and multithreading for querying public search engines.
The use of anonymous proxies for querying the search engines was legal, and was necessary to avoid IP bans from the search engines.
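Extraction-time validation of this kind is typically a normalisation pass followed by pattern checks. The patterns below are deliberately loose illustrations, not the engine's actual rules (real UK number validation follows the full national numbering plan):

```python
import re

# Loose, illustrative patterns for crawler-side filtering.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)+$")
UK_PHONE_RE = re.compile(r"^(?:\+44|0)\d{9,10}$")  # 0- or +44-prefixed, 10-11 digits

def clean_phone(raw):
    """Strip spaces, dashes and parentheses, then check the UK pattern.
    Returns the normalised number, or None if it does not look like one."""
    candidate = re.sub(r"[\s\-()]", "", raw)
    return candidate if UK_PHONE_RE.match(candidate) else None
```

In a crawler these checks run on every scraped string, so cheap rejects like these keep obvious noise (page fragments, truncated numbers) out of the database before any heavier validation.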
Stock exchange and financial data extraction for Dun & Bradstreet UK
Used an internal Dun & Bradstreet crawling technology to collect and analyse international financial regulator and stock exchange data from more than 40 countries.
- Web crawling of regulators' web sites.
- Collecting and analysing data, with persistent storage.
- Aggregating data into a centralised database.
Designing a full replacement using an advanced scripting language.
Insurance web site engine, J.L.T Insurance
Designed and implemented a new insurance web site engine for JLT Insurance, based on complex XML form definitions, with both client- and server-side validation.
- Creating an abstraction layer on top of the old pricing engine, and exporting it as a web service.
- Designing and implementing the migration from the initial relational database approach to a hybrid approach, using both relational and document-oriented (XML) databases.
September 2006 – November 2007: Legal & General
Business reporting application
- Designing and building a modern front-end application providing financial management information to the senior management and board of Legal & General. The system was fed with data from a Business Objects Universe.
- Creating the prototype for the new version of the Business Reporting Application, as a proof of concept based on emergent technologies.
- Designing and implementing the user interface for the internal Balanced Scorecard project.
2000 – 2004: Sinfo, sole trader company in France
In 2000, I created a sole trader company in France to provide specialised development and services to companies in my country. One of my customers worked and lived in both France and the United Kingdom; this influence triggered my decision to move to this country.
Below are some important projects I worked on during these years.
Various developments for Perseus Management Consultancy
- Creation of the Bildo open-source project, a non-obtrusive web component to embed photos on a web site, developed to implement specific features not found in existing web galleries.
- Creation of Sollidays, a property rental website, with features such as an administration interface, PDF booking form generation with rates, real-time availability updates, etc.
- Creation of the Legal & General Running Club web site, with shared authentication with phpBB.
- Creation of Tiaret, a North African community web site, with a forum, wiki and updatable dynamic content.
- Creation of an IFRS (International Financial Reporting Standards) forum on a private extranet.
Shipping service system, Frankal, Perpignan
Creation of a full-featured program covering everything from package delivery to customer invoice management.
- Automatic data import from partner websites.
- Delivery points visualisation on a graphical map.
- Delivery slip generation, with EAN barcodes.
- Customer address checking, using the Yellow Pages.
- Personal invoice processing system.
- Customer invoice generation.
1996 – 1999: High Tech Systems, Montescot, France
C/C++ Analyst Programmer
Now known as Nexeya, HTS was a medium-sized French company specialising in industrial engineering and real-time data acquisition systems.
In this company, I worked on small yet very interesting projects associated with the defence and aerospace industries. This was my first job as a programmer, and I acquired substantial low-level programming skills.