PHP 7: Introducing a domain name validator and making the URL validator stricter


DNS comes with a set of rules defining valid domain names. A domain name cannot exceed 255 octets (RFC 1034) and each label cannot exceed 63 octets (RFC 1035). It can contain any character (RFC 2181), but extra rules apply for hostnames (A and MX records, data of SOA and NS records): only alphanumeric ASCII characters and hyphens are allowed in labels (we’ll talk about IDNs at the end of this post), and they cannot start or end with a hyphen.

Until now, PHP had no filter validating that a given string is a valid domain name (or hostname). Worse, FILTER_VALIDATE_URL did not fully enforce domain name validity (which is mandatory for schemes such as http and https) and allowed invalid URLs. FILTER_VALIDATE_URL was also lacking IPv6 host support.

These limitations will be fixed in PHP 7. I’ve introduced a new FILTER_VALIDATE_DOMAIN filter checking domain name and hostname validity. This new filter is now used internally by the URL validator. I also added IPv6 host support to URL validation:
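Here is a quick sketch of how this can be used (assuming a PHP 7 build including these changes; the example domains are placeholders):

```php
<?php
// Validate a hostname with the new filter (FILTER_FLAG_HOSTNAME enforces
// the stricter hostname rules described above)
var_dump(filter_var('les-tilleuls.coop', FILTER_VALIDATE_DOMAIN, FILTER_FLAG_HOSTNAME)); // the domain
var_dump(filter_var('-invalid-.example.com', FILTER_VALIDATE_DOMAIN, FILTER_FLAG_HOSTNAME)); // false

// The URL validator now rejects URLs with an invalid host...
var_dump(filter_var('http://exa mple.com/', FILTER_VALIDATE_URL)); // false
// ...and accepts IPv6 hosts
var_dump(filter_var('http://[2001:db8::1]:8080/foo', FILTER_VALIDATE_URL)); // the URL
```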

There is still a big gap in PHP’s handling of domain names and URLs: internationalized domain names are not supported at all in the core. I’ve already blogged about a userland workaround, but as IDNs become more and more popular, core support in streams and validation is necessary. For instance, almost all French registrars support them, and some TLDs, such as the Chinese one, are available in the wild in a non-ASCII form. I’ve started a patch enabling IDN support in PHP’s streams. It works on Unix but still lacks a Windows build system. As it requires making ICU a dependency of PHP, I’ll publish a PHP RFC on this topic soon!

Continuous Integration for Symfony apps, the modern stack: quality checks, private Composer, headless browser testing…


Updated 2014/11/21: Fixed a bug when deploying (thanks @meyer_baptiste). Added a command to update the GitHub commit status (thanks @samuelroze).

At Les-Tilleuls.coop, we build great e-commerce platforms and high-performance web applications with quality and agility in mind.

Most of our applications share the same technical stack:

  • the PHP language with the Symfony framework and Doctrine for back applications
  • front applications developed with CoffeeScript or JavaScript, AngularJS and Twitter Bootstrap (Grunt as build system, Yeoman for scaffolding, Karma and Jasmine for tests)
  • private Git repositories (hosted on GitHub)
  • private libraries and bundles shared between applications and exposed as Composer packages
  • phpspec unit tests
  • Doctrine Data Fixtures (enhanced with Alice and Faker)
  • Behat scenarios
  • exhaustive PHPDoc
  • capifony scripts for deployment

I managed to build a pretty cool CI system that allows us to deliver high-quality Symfony applications. In this tutorial, you’ll see how it looks and how to install it step by step. If you aren’t using a CI system yet, it will drastically increase the quality of your Symfony projects and, over time, your team’s development skills.

Our CI system mostly relies on Jenkins. At every commit (on every branch) it:

  • updates our private Composer package repository index
  • runs all tests and scenarios, including web acceptance tests through headless and real browsers
  • runs a bunch of quality checks on the code: security issues, dead code and code duplication detection, cyclomatic complexity analysis, good structure and adherence to Symfony’s best practices
  • checks Coding Standards conformance
  • generates a browsable API documentation
  • instantly deploys the new version of the application on testing and sometimes even production servers (continuous deployment)

Finished feature branches are reviewed and merged into the master branch through GitHub Pull Requests by a lead dev, only when all tests and checks are green.

The following tutorial can be followed on Debian (wheezy) and Ubuntu.

Installing Jenkins

The Jenkins team maintains a Debian repository providing up-to-date binary packages. Add it to your CI box and install the package (as root):
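The commands looked like the following at the time of writing (the repository URL may have changed since; check the Jenkins website for the current one):

```shell
# As root: register the Jenkins repository key and source
wget -q -O - https://pkg.jenkins-ci.org/debian/jenkins-ci.org.key | apt-key add -
echo "deb http://pkg.jenkins-ci.org/debian binary/" > /etc/apt/sources.list.d/jenkins.list

# Install Jenkins
apt-get update
apt-get install jenkins
```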

If nothing went wrong, the Jenkins UI is now available on port 8080 of your CI box. Open it in your favorite web browser. So easy.

Securing the installation


Currently, anyone knowing the URL of your Jenkins install can take control of it. It’s a good idea to require credentials to access the CI:

  1. go to “Manage Jenkins” then “Setup Security”
  2. check “Enable security”
  3. in the “Security Realm” section, select your user provider (“Jenkins’ own user database” is a good default if you don’t have a LDAP directory or any other centralized user provider for your organization)
  4. don’t forget to check “Allow users to sign up”; it’s mandatory to be able to log in after enabling security
  5. in the “Authorization” section, choose “Project-based Matrix Authorization Strategy”, leave all checkboxes for “anonymous” empty and create a super admin user having all rights
  6. enable CSRF protection by checking “Prevent Cross Site Request Forgery exploits”
  7. save and go back to the homepage
  8. you’re prompted to authenticate; click “Sign up” and create an account with the same username you gave administration rights to earlier

Jenkins is now secured! Go back to security settings to disable the signup form.

Connecting Jenkins with GitHub repositories

Jenkins must be able to pull data from your private GitHub repositories. The best way I found is to create a GitHub machine user with read-only access to the private repositories. You can add users with write access to your repositories from GitHub by clicking “Settings” then “Collaborators” (the write permission is necessary to update the build status on GitHub; if you don’t care about this feature, just grant read-only access to the repo).

Now, we’ll create private and public SSH keys for the UNIX account running Jenkins (on Debian it is called jenkins):
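Something like this should do (the key pair will land in /var/lib/jenkins/.ssh/, where we’ll fetch the public key in the next step):

```shell
# As root: switch to the jenkins user and generate a key pair
# (accept the default location and leave the passphrase empty)
su - jenkins
ssh-keygen -t rsa
```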

The last step is adding the public key to the GitHub account of the machine user. From GitHub’s homepage, logged in as your machine user account, go to “Settings” then “SSH Keys”. Add a new key with the content of  /var/lib/jenkins/.ssh/id_rsa.pub.

Jenkins is now able to connect to our GitHub repositories.

To update the GitHub commit status, our machine user will also need an access token for the GitHub API. Go to the settings page of the user, click on “Applications” then “Generate new token” and be sure to check the “repo:status” permission. Save the token; we’ll use it in our build script.

Installing PHP

The PHP interpreter is needed to run our projects’ tests and other tools such as Composer and Satis. Install it:

apt-get install php5-cli php5-apc php5-sqlite

You may have noticed that I also installed APC to speed up script execution, as well as SQLite, which I use as the DBMS for the test environment of our Symfony applications.

Installing Satis


Satis is the most popular (and the only open source) private Composer package repository. It generates a static Composer index. Later, we will add a Jenkins task rebuilding the repository at each commit, so we always have an up-to-date Composer package repository. To do that, we must install Satis and make the jenkins user able to run it.

Satis must be installed through Composer. Switch to the jenkins user and start by installing Composer, then Satis:
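A possible sketch (the satis directory name matches the path used in the build command below):

```shell
# As the jenkins user, in its home directory
cd ~
curl -sS https://getcomposer.org/installer | php
php composer.phar create-project composer/satis satis --stability=dev
```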

Then, create a config file for Satis called packages.json:
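A minimal packages.json may look like this (repo1, repo2 and the URLs are placeholders to adapt to your own repositories):

```json
{
    "name": "My Private Package Repository",
    "homepage": "http://packages.example.com",
    "repositories": [
        { "type": "vcs", "url": "git@github.com:coopTilleuls/repo1.git" },
        { "type": "vcs", "url": "git@github.com:coopTilleuls/repo2.git" }
    ],
    "require-all": true
}
```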

This config file enables 2 private Composer packages (repo1 and repo2). Thanks to the SSH keys we configured earlier, the jenkins user is able to connect to our private GitHub repositories. Of course those repositories must have a composer.json file at their root. The full configuration options of Satis are documented on the Composer website.

Generate the package repository manually for the first time:

satis/bin/satis --no-interaction build /var/www/packages.json packages

The next step is to expose our packages over HTTP. We will use nginx for that. Switch back to the root user (Ctrl + d) and type:

# apt-get install nginx

Then change the content of /etc/nginx/sites-enabled/default by the following:
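Here is a minimal sketch of such a configuration (the root directive must point to the output directory of the satis build command):

```nginx
server {
    listen 80 default_server;

    root /var/lib/jenkins/packages;
    index index.html;

    location / {
        allow <my-ip>;
        deny all;
    }
}
```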

Don’t forget to replace <my-ip> with the list of IPs allowed to access the package repository. It will prevent exposing your private repository to the whole internet.

Restart nginx:

You should now be able to browse the repository with your favorite web browser.

Enabling Git and GitHub support


Git and GitHub support are available through plugins. Go to “Manage Jenkins” then “Manage Plugins” and install the plugins called “Git Plugin” and “GitHub Plugin”. Their names are self-explanatory.

Then, we will setup a GitHub Webhook to trigger a build every time a commit is pushed, in any branch.

Go to your GitHub repository, click on “Settings” then “Webhooks & Services”. Click on the “Add service” select box and choose “Jenkins (GitHub plugin)”. Enter your Jenkins’ webhook endpoint in the box, it should be something like http://ci.les-tilleuls.coop:8080/github-webhook/  (just replace the domain part of the URL) and click “Add service”.

The webhook is set! Jenkins will be notified every time we push something to our GitHub repo. Repeat this step for every GitHub repository you want to trigger builds.

Installing PHP CS Fixer

PHP CS Fixer is an awesome tool provided by SensioLabs. Its 1.0 version has just been released. It automatically fixes PSR-0, PSR-1, PSR-2 and Symfony coding standard violations.

With its --dry-run option, it can be used in our CI to check if the produced code is clean.

PHP CS Fixer is packaged as a PHAR archive. Let’s install it:
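At the time of writing, the PHAR could be downloaded from the SensioLabs server (as the jenkins user, in its home directory):

```shell
wget http://get.sensiolabs.org/php-cs-fixer.phar
```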

Installing phpDocumentor


phpDocumentor generates beautiful HTML documentation from the DocBlocks documenting your classes and methods.

As with PHP CS Fixer, we will install the PHAR version, but before downloading it, we need to install some extra Debian packages:
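The sketch below assumes phpDocumentor’s usual dependencies (Graphviz for class diagrams and the XSL extension); the download URL is the one in use at the time of writing:

```shell
# As root: install the dependencies
apt-get install graphviz php5-xsl

# As the jenkins user: download the PHAR
wget http://phpdoc.org/phpDocumentor.phar
```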

Web Acceptance Testing with Behat and PhantomJS

We love Behat and Mink. We use them to manage our user stories and to run automated web testing scenarios.

Standard Symfony controllers are tested with the Symfony2 extension. But nowadays, we create more and more Single Page Applications, usually composed of a Symfony 2-backed JSON/REST API and an AngularJS client.

It sounds like a good idea to also test interactions between AngularJS clients and Symfony REST API in the CI system. Behat and Mink will help.

I usually use Mink Selenium2 to drive PhantomJS, an awesome headless browser based on WebKit. Unlike the Symfony2 extension, PhantomJS needs to access the application through a public URL. We need a front controller exposing the test environment; let’s write it. We will also configure the built-in web server provided by Symfony, to avoid configuring a heavier solution such as nginx + PHP-FPM.

The first step is to create a new front controller for the test  environment (Symfony comes with front controllers for prod  and dev  environments but not for test ).

Create a new front controller called app_test.php in the web/ directory of your application. It should contain something like the following:
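It can mirror the stock app_dev.php shipped with the Symfony Standard Edition, simply booting the test environment instead (a sketch; you may want to keep app_dev.php’s IP restriction check as well):

```php
<?php

use Symfony\Component\HttpFoundation\Request;

$loader = require_once __DIR__.'/../app/bootstrap.php.cache';
require_once __DIR__.'/../app/AppKernel.php';

// Boot the kernel in the test environment, with debug enabled
$kernel = new AppKernel('test', true);
$kernel->loadClassCache();

$request = Request::createFromGlobals();
$response = $kernel->handle($request);
$response->send();
$kernel->terminate($request, $response);
```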

We also need a custom router for the built-in web server. It will allow the test environment to be used. Create a file in the app/config/ directory of your application called router_test.php. Here is the sample code:
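It can be adapted from the router script Symfony provides for the dev environment, pointing to our new front controller (a sketch):

```php
<?php

// Router script for PHP's built-in web server: serve existing files
// directly, and send every other request to the test front controller
if (is_file($_SERVER['DOCUMENT_ROOT'].DIRECTORY_SEPARATOR.$_SERVER['SCRIPT_NAME'])) {
    return false;
}

$_SERVER['SCRIPT_FILENAME'] = $_SERVER['DOCUMENT_ROOT'].DIRECTORY_SEPARATOR.'app_test.php';

require $_SERVER['SCRIPT_FILENAME'];
```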

The next step is to install PhantomJS. The PhantomJS package is not available in Debian stable. We will fall back to the binary version provided on the official website:
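Something like the following (1.9.8 was the current version at the time of writing; check phantomjs.org for the latest one):

```shell
# As root: download the official binary build and put it in the PATH
wget https://bitbucket.org/ariya/phantomjs/downloads/phantomjs-1.9.8-linux-x86_64.tar.bz2
tar xjf phantomjs-1.9.8-linux-x86_64.tar.bz2
mv phantomjs-1.9.8-linux-x86_64/bin/phantomjs /usr/local/bin/
```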

The behat.yml file of your project must be changed to specify the base URL and the Selenium2 WebDriver host to use. Here is a sample file:
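A sketch of such a file (the ports are assumptions: 8000 for Symfony’s built-in web server, 8910 for PhantomJS’s WebDriver mode):

```yaml
default:
    extensions:
        Behat\Symfony2Extension: ~
        Behat\MinkExtension:
            base_url: http://localhost:8000
            sessions:
                default:
                    selenium2:
                        wd_host: http://localhost:8910/wd/hub
```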

You are ready to write scenarios testing the whole app. They will be executed at each commit! To test consistency across browsers and platforms (mobile devices, exotic systems…), you can take a look at SauceLabs. This SaaS platform is well integrated with Behat and can be a good (but more expensive) replacement for PhantomJS!

Checking code quality with SensioLabs Insight


SensioLabs published earlier this year a SaaS platform running quality analysis for PHP and especially Symfony projects. That service is called Insight. It’s user friendly, cool to play with (it provides some gamification with a system of “medals”) and finds a lot of bad practices in bundles and Symfony applications.

The first step is to create an account and register your project on the Insight website. Configure your credentials, set the Git repository to analyse and run a first analysis manually. You will probably find some problems in your app that must be fixed!

SensioLabs Insight provides a tool allowing new analyses to be run from a CI system. Like the other tools we already installed, the insight command is available as a PHAR. Download it as the jenkins user:
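At the time of writing, the PHAR is available at the following URL:

```shell
wget https://get.insight.sensiolabs.com/insight.phar
```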

Find the uuid of the project you want to analyse with the following command:
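The projects command lists all the projects of your account together with their uuid:

```shell
php insight.phar projects
```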

The first time you use insight.phar, you’ll be prompted for your user id and API token. You can find them in your SensioLabs Insight account page (“API/SDK” tab). Your credentials will be saved by insight.phar.

We are now able to integrate SensioLabs Insight in the Jenkins script. Write down the uuid of your project somewhere; we will need it later.

Insight can output violations in the PMD format. You guessed it, Jenkins has a PMD plugin able to display nicely that sort of output.

Go to “Manage Jenkins”, “Plugin Manager”, click on the “Available” tabs and search for “PMD plugin”. Install it.

SensioLabs Insight is free for public open source projects, but a subscription is needed to analyse private projects. I think it’s a good investment, but I can’t blame you if you prefer using open source software. Sebastian Bergmann (the author of phpunit) maintains great documentation explaining how to set up open source quality checking tools including PHPMD, phploc, PHP_Depend and phpcpd. Feel free to complete your installation with jenkins-php.org.

Note: at the time of writing, there are some issues with the PMD output and the fail-condition option of the insight command. I’ve submitted fixes and I hope they will soon be integrated into the PHAR file.

Continuous Deployment with Capifony

It’s always good to have a testing server running the latest version of the project. Sometimes, we go one step further and let Jenkins push stable code to production thanks to Git tags (not detailed here).

Anyway, I use a tool you probably already know: capifony. It’s deployment software specialized for Symfony projects, built on top of Capistrano. It’s the easy way to deploy your application. It handles copying the source code to the server using Git, setting correct directory permissions, installing the project’s dependencies, running database migration scripts, restarting services such as PHP-FPM, and much more. If you don’t already use it, give it a try, you will love it.

Like most Ruby applications, capifony is available as a gem. There is nothing easier than installing it on Debian:
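On a stock Debian box, this amounts to something like:

```shell
# As root: install Ruby and its gem system, then capifony
apt-get install ruby
gem install capifony
```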

Creating the Jenkins project


Go back to the Jenkins Dashboard and click on the “create new jobs” button. On the next screen give a name to your project and choose “Freestyle project”.

On the next screen:

  1. fill the “GitHub project” input with the base URL of your GitHub repository

In the “Source Code Management” section:

  1. select “Git” as “Source Code Management”
  2. in “Repository URL”, enter the SSH clone URL of your Git repository (it must be like git@github.com:coopTilleuls/myrepo.git )
  3. click the “Add” button below “Credentials”
  4. choose “SSH Username with private key” as “Kind”
  5. choose “From a file on Jenkins master” for “Private Key” and click “Add” (this allows us to use the SSH keys we created and added to GitHub in the previous step)
  6. leave “Branches to build” empty (to build all branches of the repository)
  7. select “githubweb” as “Repository browser” and enter the URL of your repo’s homepage again

In “Build triggers”:

  1. check “Build when a change is pushed to GitHub”

In “Build”:

Add a new “Execute shell” build step and use the following script as a template to fit your own needs:
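Here is a sketch of such a script. Treat everything in it as an assumption to adapt: the tool paths match the installation steps above, GITHUB_TOKEN and INSIGHT_UUID are the values collected earlier, GIT_COMMIT is provided by the Jenkins Git plugin, and the repository name in the status URL is a placeholder:

```shell
#!/bin/sh
GITHUB_TOKEN=your-github-token
INSIGHT_UUID=your-project-uuid
DEPLOY=false
status=0

mkdir -p build

# Rebuild the private Composer package repository
~/satis/bin/satis --no-interaction build /var/www/packages.json packages || status=1

# Install the project dependencies
composer install --no-interaction || status=1

# Coding standards (dry run: report violations without fixing them)
php ~/php-cs-fixer.phar fix --dry-run . || status=1

# Unit tests (JUnit output for the Jenkins publisher)
bin/phpspec run -f junit > build/unit.xml || status=1

# Start the built-in web server and PhantomJS in the background,
# then run the Behat scenarios against them
php app/console server:run -r app/config/router_test.php localhost:8000 &
SERVER_PID=$!
phantomjs --webdriver=8910 &
PHANTOM_PID=$!
sleep 5
bin/behat || status=1
kill $SERVER_PID $PHANTOM_PID

# Quality analysis (PMD output for the Jenkins publisher)
php ~/insight.phar analyze --no-interaction "$INSIGHT_UUID" || status=1
php ~/insight.phar analysis --no-interaction --format=pmd "$INSIGHT_UUID" > build/pmd.xml || status=1

# API documentation
php ~/phpDocumentor.phar -d src -t build/doc || status=1

# Report the result to the GitHub commit status API
[ $status -eq 0 ] && STATE=success || STATE=failure
curl -s -H "Authorization: token $GITHUB_TOKEN" \
    -d "{\"state\": \"$STATE\"}" \
    "https://api.github.com/repos/coopTilleuls/myrepo/statuses/$GIT_COMMIT" > /dev/null

# Deploy only successful builds
if [ "$DEPLOY" = "true" ] && [ $status -eq 0 ]; then
    cap deploy || status=1
fi

exit $status
```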

Basically, the script runs the various tools we installed earlier and doesn’t stop when an error occurs. Instead, it collects the return statuses of the commands and, at the end, reports whether the build is successful. This allows all checks to always run, even if one fails. The deployment (if enabled) only occurs if the build is successful. The built-in web server and PhantomJS run in the background so they can work simultaneously with Behat; they are killed after Behat exits.

Don’t forget to customize the values of the variables at the top of the script.

Why not an XML file? Because I sometimes use other build servers such as Travis CI and Bamboo. Using a simple shell script makes it easy to replace Jenkins with another server. Shell scripts can also be versioned directly in the Git repository.

In “Post-build Actions”:

  1. Add the “Set build status on GitHub commit” step
  2. Add “Publish JUnit test result report” and specify build/unit.xml for “Test report XMLs”
  3. Add “Publish PMD analysis results” and specify build/pmd.xml for “PMD results”
  4. Add “Publish Javadoc” and set build/doc as “Javadoc directory”
  5. Add other actions you’re interested in (like sending mails)

Troubleshooting GitHub API rate errors

If you start getting errors like “Could not fetch https://api.github.com/[…], enter your GitHub credentials to go over the API rate limit” in the console output, it’s because you exceeded the GitHub API rate limit for anonymous users. Just open a shell, switch to the jenkins user, download something with Composer and enter your machine user’s credentials when prompted:

Because Composer stores an OAuth token in .composer/auth.json, all subsequent calls to GitHub will succeed, even when launched by Jenkins.

Troubleshooting mails

To make Jenkins able to send mails (when a build fails, for instance), you need to configure an SMTP server. The easiest way is to install a local server on your CI box:

Default options should be OK.

To customize mail parameters, from the Jenkins homepage go to “Manage Jenkins” then “Configure System”. SMTP settings are under the “E-mail notification” section, and the address used by Jenkins to send mails is under “Jenkins Location”.

Automatically updating the locally installed tools

As with any software, the tools used by our CI server get updates fixing bugs and adding new features. Debian and Jenkins can be updated using apt. Jenkins plugin updates are managed directly through the Jenkins user interface. But the software we installed locally must be updated “by hand”. We will periodically run a small shell script I’ve written to update these tools.

Create the following update-tools.sh script as the jenkins user in its home directory ( ~jenkins):
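Here is a sketch of such a script, assuming the tools were installed in the jenkins home directory as shown above:

```shell
#!/bin/sh
# Update Composer and Satis
php ~/composer.phar self-update
php ~/composer.phar update --working-dir ~/satis

# Re-download the latest version of each PHAR tool
wget -q -O ~/php-cs-fixer.phar http://get.sensiolabs.org/php-cs-fixer.phar
wget -q -O ~/phpDocumentor.phar http://phpdoc.org/phpDocumentor.phar
wget -q -O ~/insight.phar https://get.insight.sensiolabs.com/insight.phar
```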

Don’t forget to make it executable by executing  chmod +x update-tools.sh. To get the updates every night, run  crontab -e and add the following line:

00 00 * * * ~/update-tools.sh

 

Your CI system is now ready. Your full test suite and quality analyses now run against newly produced code in real time! We also have quality checks and tests specific to our frontend apps. If people are interested in that, maybe I’ll write another post detailing our infrastructure.

I’m sure you have your own preferred quality tools and CI best practices for Symfony projects. Share them with us in the comments :)

Internationalized Domain Name (IDN) and PHP


Currently, PHP doesn’t have any native support for IDNs: domains with non-ASCII characters such as http://www.académie-française.fr.

If you try to connect to such a site, you’ll get nothing but an error:

RFC 3490 specifies that applications must convert IDNs to a plain ASCII representation called Punycode before making DNS and HTTP requests. If we use this representation for our domain, it works:

Fortunately, the PHP INTL extension provides a function that converts an IDN to its Punycode representation:  idn_to_ascii
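For instance (the echoed value is the xn--… ASCII form of the domain):

```php
<?php
// Requires the INTL extension
echo idn_to_ascii('académie-française.fr');
```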

This function must be applied only to the domain part of a URL. It will not work if applied to the whole URL.

Let’s leverage parse_url and PECL HTTP’s \http\Url to build an IDN URL converter that works in all cases:
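Here is a sketch of such a converter. It rebuilds the URL by hand from the parse_url() parts (the \http\Url class from PECL HTTP can do this recomposition for you); the function name is made up for this example:

```php
<?php

/**
 * Converts the host of a URL to its Punycode representation.
 * Requires the INTL extension.
 */
function idn_url_to_ascii($url)
{
    $parts = parse_url($url);
    if (false === $parts || !isset($parts['host'])) {
        return false;
    }

    // Only the host must be converted to Punycode
    $parts['host'] = idn_to_ascii($parts['host']);

    // Rebuild the URL from its parts
    $ascii = isset($parts['scheme']) ? $parts['scheme'].'://' : '';
    if (isset($parts['user'])) {
        $ascii .= $parts['user'];
        $ascii .= isset($parts['pass']) ? ':'.$parts['pass'] : '';
        $ascii .= '@';
    }
    $ascii .= $parts['host'];
    $ascii .= isset($parts['port']) ? ':'.$parts['port'] : '';
    $ascii .= isset($parts['path']) ? $parts['path'] : '';
    $ascii .= isset($parts['query']) ? '?'.$parts['query'] : '';
    $ascii .= isset($parts['fragment']) ? '#'.$parts['fragment'] : '';

    return $ascii;
}
```

Usage: `file_get_contents(idn_url_to_ascii('http://www.académie-française.fr/'));`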

Use it to access our IDN URL:

According to this stackoverflow answer and RFC 6125, it should also work for HTTPS connections to IDNs. But after some searching, I wasn’t able to find any HTTPS-enabled IDN domain in the wild. If you’re aware of such a setup, please post a comment with the URL!

Work is in progress in the PHP language to get native IDN support in streams and URL validation. Maybe one day I’ll be able to natively connect to classy URLs like http://kévin.dunglas.fr.

In the meantime, you can use this function to access URLs containing IDNs. On the validation side, the Symfony URL validator has built-in IDN support.

Remember that this function requires INTL and HTTP extensions to work.

Enabling OPcache for PHP 5.6+ installed with Homebrew


Since 5.5, PHP comes with a built-in OPcache system. This PHP accelerator has been open sourced by Zend and is a good replacement for APC.

If you installed PHP 5.6 on your Mac with Homebrew, you may have noticed that OPcache is not enabled by default. Even though the extension has been compiled, manual configuration is needed to enable it.

  1. Open /usr/local/etc/php/5.6/php.ini
  2. Add zend_extension=/usr/local/lib/php/extensions/no-debug-non-zts-20131226/opcache.so at the end of this file
  3. Restart PHP-FPM, or Apache if you are using mod_php

Your dev environment is now blazing fast!

Interview about Les-Tilleuls.coop in J’innove

Les-Tilleuls.coop: a co-op status that unleashes creativity and innovation
Thursday, July 3, 2014

Founded in 2011, Les-Tilleuls.coop is a web agency specialized in custom e-commerce software development. Based at EuraTechnologies, this Jeune Entreprise Innovante (young innovative company) chose to organize its management model as a cooperative: “the approach originates in the free software movement, which is built on a community spirit. I wanted to recreate that spirit and apply it in an entrepreneurial context,” explains Kévin Dunglas, manager and technical director of the company.

People are at the heart of the company’s project. The agency is 95% owned by its employees and run by a manager. Decision-making is shared: “each employee-cooperator has one vote, whatever their level of financial involvement. Our way of working is meant to be democratic. All employees take part in the company’s strategic decisions. Profits are redistributed fairly.” This free spirit is also a hiring condition: “after one year of seniority, we systematically offer employees the opportunity to become cooperators.”

The effects?

  • Loyal, entrepreneurial employees: “IT services companies and web agencies in general are notorious for their high turnover rates. Here, we bet on the employees’ involvement in the life of the company. My goal is to give meaning back to work.” This entrepreneurial spirit shows in the projects carried out for the agency’s clients: “a stronger commitment from employees, a will to always do better…”.
  • A lever for innovation and creativity: “This status encourages the sharing of ideas. I also invite employees to frequently work on free software, outside of any client constraint. This freedom allows them to freely experiment with new applications and thus be a source of proposals for future projects.”

Since its creation, Les-Tilleuls.coop has always been profitable, and well-known companies trust it: Coyote France, National Geographic, Hema, Virgin Mobile…

Some examples of innovative creations developed by the Les-Tilleuls.coop cooperators:

  • A car booking platform accessible via smartphone for the Citiz company. The system lets a car-sharing operator’s customer park the vehicle wherever they want, and lets the next user locate the car in real time.
  • An e-commerce platform that adapts to sales channels for Alice’s Garden. As soon as a customer places an order on Amazon, PriceMinister and the like, the information is fed into a centralized application that automatically notifies the carrier.

J’innove en Nord Pas de Calais

Slides from the first sfPot (Symfony meetup) in Lille

Thank you all for attending the first sfPot, held last Thursday at l’Autrement Dit.

We discussed how to build modern web applications architected around a REST API made with Symfony and a JavaScript client using frameworks such as AngularJS and Backbone.js.

As promised, here are the slides, made by Alexandre Salomé and myself, that were presented there:

We hope to see you just as numerous and enthusiastic at the next event, which is coming soon! Thanks again to Cécile for organizing the evening.

Personal data leak at Pôle emploi? Data resale? Hacking?


Here is the first external contribution to this blog: an article written by a friend, revealing a major privacy problem at Pôle emploi (the French unemployment agency). Users’ personal data appears to be in the wild, used to send spam and probably to attempt identity theft.

A quick analysis of the headers of the email in question confirms the Pôle emploi lead: the recipient’s email address is written entirely in uppercase, which is an unusual practice… except in the contact details management interface of the Pôle emploi website.

Potentially tens of thousands of names, postal addresses and email addresses are in the wild. Users’ complaints are starting to spread across the web; let’s hope Pôle emploi will react in proportion to the scale of the problem.

For several days now, job seekers have been receiving rather shady emails. A blog warns about these fake job interviews, and judging by the number of comments, many job seekers seem to be affected.

And I am one of them…

I received a strange email yesterday. It says that, following my reply to a job ad, I am invited to an interview for a customer relations position. The catch is that I never replied to any ad, and what’s more, I’m absolutely not looking for a customer relations position! Once over my surprise, I looked at the email more carefully and several details jumped out at me: the wording is not appropriate for this kind of process; the roles of the person I am supposed to meet and of the person sending the email are not specified; the sender’s address is an SFR one… and above all, the recruitment agency contacting me doesn’t exist, any more than the company so eager to hire me.

After wondering a lot about the purpose of such a scheme and reading the comments on the blog mentioned above, I think the identity theft hypothesis is likely. Get people in a difficult situation to come to an interview, ask for their ID card to prepare the contract they are supposed to come back and sign the next day, and never see them again… in short, take advantage of unemployed people’s hardship. Even if that still seems really twisted to me…

I wasn’t planning to go anyway. What really worries me is how our personal data was obtained. Indeed, our last name, first name, email address, phone number (for some), place of residence (living in Lille, I was summoned to Roubaix and not to Paris) and unemployment status are all linked together.
Apart from Pôle emploi, no other organization has all of this information (no need to contradict me: I’ve thought carefully about it and, as I have several email addresses, it’s not hard for me to know which organizations have which information).

So, a leak from Pôle emploi? A hack of their IT system (after the various “affairs” that called its reliability into question, the question is worth asking very seriously)? Or even data resale?
According to the testimonies of the job seekers affected by this scheme, it seems that neither Pôle emploi nor the police are interested in the problem… It worries me that personal data about me is being used fraudulently, especially when that data comes from a public agency like Pôle emploi.
So what now?

Below, the infamous email.

From: <annie.afchain@sfr.fr>
Date: March 21, 2014, 2:24 pm
Subject: Interview proposal at f 2f, Roubaix
To: xxx@xxx.com

Summons: xxx xxx
Hello,

Following your reply to the ad, we ask you to kindly present yourself at f 2f to meet Mr Hurand at 35, Avenue Jean Baptiste Lebas, 2nd floor (between the SNCF train station and the town hall), 59100 Roubaix.

We are therefore setting a tentative appointment for Wednesday, March 26, 2014 at 4:15 pm, but you can of course reschedule it before Friday, March 28, the last day of interviews for this customer relations position, by replying to my email.

We are counting on your presence at the scheduled time, with a printed copy of your CV, for this individual interview.

A. AFCHAIN
SH recrutement

Email sent with the EMA software http://www.emamailing.com
EMA is free for non-commercial use

This email contains no virus or malware because avast! Antivirus protection is active.

PHP SocialShare 0.2.1 released

I’ve just published a new version of PHP SocialShare, a library allowing you to retrieve, server-side, webpages’ number of shares and share links from social networks.

This release fixes a bug that broke Google support when the number of shares is greater than 1K, and uses the brand new phpspec 2.0 stable release as the spec system.

PHP SocialShare is available on GitHub and installable through the awesome Composer.


New project: the Lost In The Supermarket shop

Freshly put online, here is the shop of the English fashion clothing brand Lost In The Supermarket (a project I’m involved in).

It was built with Prestashop, Bootstrap and LESS. The design is responsive, although a few teething problems remain on smartphones. The pages contain Schema.org semantic markup in order to display rich snippets in search engine results. The graphic design is by Joad Martin.


Interview on Grand Lille TV

Yesterday, I was interviewed on the show Les rendez-vous de l’éco on Grand Lille TV to present Les-Tilleuls.coop.