Vidéo : construire des applications API-centric avec API Platform et Symfony

Here is the video of the talk I gave at Symfony Live 2015 about API Platform, a solution for building modern web applications based on a hypermedia, auto-discoverable REST API.

Introducing API Platform (beta): the next generation PHP web framework

Update: API Platform now has a dedicated website with an up to date version of the following tutorial.

PHP celebrates its 20th birthday this week. In 20 years, the web has changed dramatically and is now evolving faster than ever.

PHP.net, Symfony, Facebook and many others have worked hard to improve and professionalize the PHP ecosystem. The PHP world has closed the gap with most other backend solutions and is often more innovative than they are.

But in the critical areas I’ve described previously, many things can still be improved. Almost all existing solutions are designed and documented to create websites the old way: a server generates and then sends plain-old HTML documents to browsers.

What better gift for PHP’s birthday than a brand new set of tools to kickstart modern web projects? Here comes Dunglas’s API Platform, a framework for API-first projects built on top of Symfony! Like other modern frameworks such as Zend Framework and Symfony, it’s both a full-stack all-in-one framework and a set of independent PHP components and bundles that can be used separately.

The architecture promoted by the framework may put many of you off, but read until the end and you will see how API Platform makes modern development easy and fun again:

  • Start by creating a hypermedia REST API exposing structured data that can be understood by any compliant client, such as your own apps but also search engines (JSON-LD with the Schema.org vocabulary). This API is the central and unique entry point to access and modify data. It also encapsulates the whole business logic.
  • Then create as many clients as you want using the frontend technologies you love: an HTML5/JavaScript webapp querying the API in AJAX (of course), but also a native iOS or Android app, or even a desktop application. Clients only display data and forms.

Let’s see how to do that. In this tutorial, we will create a typical blog application with API Platform.

If you are in a hurry, a demo is available online and all sources created during this tutorial are available on GitHub:

blog-api-platform

To create the API-side of our project we will:

  • Bootstrap a fully featured and working data model, including ORM mapping, validation rules and semantic metadata, with the generator provided by API Platform (of course you can also handcraft your data model or modify the generated one to fit your needs).
  • Expose this data model through a read/write (CRUD) API following the JSON-LD and Hydra Core Vocabulary open standards, with Schema.org metadata and a ton of features out of the box: pagination, validation, error serialization, filters and a lot of other awesome features (here too, everything is extensible thanks to a powerful event system and strong OOP).

Then we will develop a tiny AngularJS webapp to illustrate how to create and consume data from the API. Keep in mind that you can use your preferred client-side technology (tested and approved with Angular, React, Ionic and Swift, but it works with any language able to send HTTP requests).

Prerequisites

Only PHP 5.5+ is required to run Dunglas’s API Platform. A built-in web server is shipped with the framework for the development environment.

To follow this tutorial, a database must be installed (but it’s not a hard dependency of the framework). I recommend MySQL or MariaDB, but other major DBMSs, including SQLite, PostgreSQL, Oracle and SQL Server, are supported through Doctrine.

Installing the framework

Let’s start our new blog API project. The easiest way to create a new project is to use Composer (you need to have it installed on your box):
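If memory serves, the beta skeleton was published on Packagist as dunglas/api-platform, so the command should look something like:

    composer create-project dunglas/api-platform blog-api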

Composer creates the skeleton of the new blog API, then retrieves the framework and all its dependencies.

At the end of the installation, you will be prompted for some configuration parameters including database credentials. All configuration parameters can be changed later by editing the app/config/parameters.yml file.

Dunglas’s API Platform is preconfigured to use the popular Doctrine ORM. It is supported natively by all API Platform components. However, the Doctrine ORM is fully optional: you can replace it with your favorite ORM, use no ORM at all, or even no database.

The installer will ask for:

  • mail server credentials (to send mails)
  • the locale of the application
  • the URL of your default web client application, used to automatically set appropriate CORS headers; set it to http://localhost:9000 (the default URL of the built-in Grunt server) to follow this tutorial
  • a name and a description of the API that will be used in the generated documentation
  • a secret token (choose a long one) for cryptographic features

Take a look at the content of the generated directory. You should recognize a Symfony application directory structure. It’s fine and intended: the generated skeleton is a perfectly valid Symfony full-stack application that follows the Symfony Best Practices, so you can work with it as with any standard Symfony application.

The skeleton comes with a demonstration bookstore API. Remove it:

  • empty  app/config/schema.yml and  app/config/services.yml
  • delete all files in the src/AppBundle/Entity/ directory

Generating the data model

The first incredibly useful tool provided by Dunglas’s API Platform is its data model generator, also known as PHP Schema. This API Platform component can also be used standalone to bootstrap any PHP data model.

To kickstart our blog data model, we browse Schema.org and find an existing schema that describes perfectly what we want: https://schema.org/BlogPosting

The schema command line tool will instantly generate a PHP data model from the Schema.org vocabulary.

Browse Schema.org, choose the types and properties you need (there are a bunch of schemas available), run our code generator, and you’re done! You get a fully featured PHP data model including:

  • A set of PHP entities with properties, constants (enum values), getters, setters, adders and removers. The class hierarchy provided by Schema.org will be translated to a PHP class hierarchy with parents as abstract classes. The generated code complies with PSR coding standards.
  • Full high-quality PHPDoc for classes, properties, constants and methods extracted from Schema.org.
  • Doctrine ORM annotation mapping including database columns with type guessing, relations with cardinality guessing, class inheritance (through the @AbstractSuperclass annotation).
  • Data validation through Symfony Validator annotations including data type validation, enum support (choices) and check for required properties.
  • Interfaces and Doctrine ResolveTargetEntityListener support.
  • Lists of values provided by Schema.org exposed as PHP Enum classes.

Reusing an existing semantic schema has many advantages:

Don’t Reinvent The Wheel

Data models provided by Schema.org are popular and have proven efficient. They cover a broad spectrum of topics including creative works, e-commerce, events, medicine, social networking, people, postal addresses, organizations, places and reviews. Schema.org has its roots in a ton of preexisting, well-designed vocabularies and is successfully used by more and more websites and applications.

Pick up schemas applicable to your application, generate your PHP model, then customize and specialize it to fit your needs.

Improve SEO and user experience

Adding Schema.org markup to websites and apps increases their ranking in search engine results and enables awesome features such as Google Rich Snippets and Gmail markup.

Mapping your app’s data model to Schema.org structures can be a tedious task. Using the generator, your data model will be derived from Schema.org. Serializing your data as JSON-LD will not require specific mapping nor adaptation. It’s a matter of minutes.

Be ready for the future

Schema.org improves the interoperability of your applications. Used with hypermedia technologies such as Hydra, it’s a big step towards the semantic and machine-readable web. It opens the way to generic web API clients able to extract and process data from any website or app using such technologies.

To generate our data model from Schema.org types, we must create a YAML configuration file for PHP Schema:
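A minimal sketch (the keys follow PHP Schema’s configuration format; the property subset matches the requests made later in this tutorial, and the range option narrows author, which Schema.org declares as Organization or Person, down to Person):

    # app/config/schema.yml
    types:
        BlogPosting:
            properties:
                name: ~
                articleBody: ~
                isFamilyFriendly: ~
                datePublished: ~
                author: { range: Person }
        Person:
            properties:
                name: ~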

Then execute the generator:
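Assuming Composer’s bin-dir is bin/, as in the skeleton (the binary may live in vendor/bin otherwise):

    bin/schema generate-types src/ app/config/schema.yml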

Take a look at the content of the src/AppBundle/Entity/ directory. PHP Schema generated a set of Plain Old PHP entities representing our data model. As promised, our entities include:

  • type documentation from Schema.org, converted to PHPDoc
  • Doctrine ORM mapping annotations (including for relations)
  • Symfony validation annotations
  • Schema.org IRI mapping (the @Iri annotations); we will see later that the API bundle uses them to expose structured semantic data
  • and they follow the PSR-2 coding style

The data model is fully functional. You can hack it (modify entities, properties, indexes, validation rules…), or use it as is!

Ask Doctrine to create the database of the project:
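    app/console doctrine:database:create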

Then generate database tables related to the generated entities:
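    app/console doctrine:schema:create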

PHP Schema provides a lot of configuration options. Take a look at its dedicated documentation. Keep in mind that PHP Schema is also available as a standalone tool (a PHAR will be available soon) and can be used to bootstrap any PHP project (it works fine with raw PHP, API Platform and Symfony, and it has an extension mechanism allowing it to be used with other technologies such as Zend Framework and Laravel).

Sometimes you will have to design a data model with very specific business types not available in Schema.org. Sometimes you will find Schema.org types that partially match what you want but need to be adapted.

Keep in mind that you can always create your very own data model from scratch. It’s perfectly OK and you can still use API Platform without PHP Schema.

Anyway, PHP Schema is a tool intended to bootstrap the data model. You can, and you will, manually edit the generated PHP entities. Once you start editing the generated files, be careful not to run the generator again: it will overwrite your changes (this behavior will be enhanced in future versions). At that point, the best thing to do is to remove dunglas/php-schema from your composer.json file.

Exposing the API

We have a working data model backed by a database. Now we will create a hypermedia REST API thanks to another component of Dunglas’s API Platform: DunglasApiBundle.

Like PHP Schema, it is already preinstalled and properly configured. We just need to declare the resources we want to expose.

Exposing a resource collection basically consists of registering a new Symfony service. For our blog app, we will expose through the API the two entity classes generated by PHP Schema: BlogPosting (blog post) and Person (author of the post):
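From memory of the beta’s service-based configuration (the parent service and the tag were both named api.resource; double-check against the bundle’s README):

    # app/config/services.yml
    services:
        resource.blog_posting:
            parent:    "api.resource"
            arguments: [ "AppBundle\\Entity\\BlogPosting" ]
            tags:      [ { name: "api.resource" } ]

        resource.person:
            parent:    "api.resource"
            arguments: [ "AppBundle\\Entity\\Person" ]
            tags:      [ { name: "api.resource" } ]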

And our API is already finished! Could it be any easier?

Start the integrated development web server:  app/console server:start

Then open http://localhost:8000/doc with a web browser:

API Platform doc

Thanks to NelmioApiDocBundle’s support for DunglasApiBundle and its integration with API Platform, you get for free an automatically generated, human-readable documentation of the API (Swagger-like). The doc also includes a sandbox to try the API.

You can also use your favorite HTTP client to query the API. I strongly recommend Postman. It is lower level than the sandbox and will allow you to forge and inspect JSON requests and responses easily.

Open http://localhost:8000 with Postman. This URL is the entry point of the API. It gives access to all exposed resources. As you can see, the API returns minified JSON-LD. For better readability, JSON snippets have been prettified in this document.

Trying the API

Add a person named Kévin by issuing a POST request on http://localhost:8000/people with the following JSON document as raw body:
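Assuming the generated Person entity kept Schema.org’s name property (as in the PHP Schema config above):

    {
        "name": "Kévin"
    }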

The data is inserted in the database. The server replies with a JSON-LD representation of the freshly created resource. Thanks to PHP Schema, the @type property of the JSON-LD document references a Schema.org type:
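It should look something like this (a sketch based on the bundle’s JSON-LD conventions; exact fields may differ):

    {
        "@context": "/contexts/Person",
        "@id": "/people/1",
        "@type": "http://schema.org/Person",
        "name": "Kévin"
    }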

The JSON-LD spec is fully supported by the bundle. Want proof? Browse http://localhost:8000/contexts/Person.

By default, the API allows GET (retrieve, on collections and items), POST (create), PUT (update) and DELETE (self-explanatory) HTTP methods. You can add and remove any operation you want. Try it!

Now, browse http://localhost:8000/people :

Pagination is also supported (and enabled) out of the box.

It’s time to post our first article. Run a POST request on http://localhost:8000/blog_postings with the following JSON document as body:
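An illustrative body (the string value for isFamilyFriendly is the deliberate mistake discussed next; the author IRI is the one returned when we created Kévin):

    {
        "name": "API Platform is great",
        "articleBody": "A post about API Platform.",
        "isFamilyFriendly": "maybe",
        "datePublished": "2015-05-11",
        "author": "/people/1"
    }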

Oops… the isFamilyFriendly property is a boolean, and our JSON contains an incorrect string. Fortunately, the bundle is smart enough to detect the error: it uses the Symfony validation annotations generated earlier by PHP Schema. It returns a detailed error message in the Hydra error serialization format:
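Something along these lines (a sketch of the bundle’s ConstraintViolationList serialization; field names may differ slightly):

    {
        "@context": "/contexts/ConstraintViolationList",
        "@type": "ConstraintViolationList",
        "hydra:title": "An error occurred",
        "hydra:description": "isFamilyFriendly: This value should be of type boolean.",
        "violations": [
            {
                "propertyPath": "isFamilyFriendly",
                "message": "This value should be of type boolean."
            }
        ]
    }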

Correct the body and send the request again:
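    {
        "name": "API Platform is great",
        "articleBody": "A post about API Platform.",
        "isFamilyFriendly": true,
        "datePublished": "2015-05-11",
        "author": "/people/1"
    }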

We fixed it! By the way, you just learned how to work with relations. In a hypermedia API, every resource is identified by a unique IRI (a URL is an IRI). They are available in the @id property of every JSON-LD document generated by the API, and you can use them as references to set relations, as we did in the previous snippet for the author property.

Dunglas’s API Platform is smart enough to understand any date format supported by PHP’s date functions. In production, we recommend the format specified by RFC 3339.

We already have a powerful hypermedia REST API (still without writing a single line of PHP), but there is more.

Our API is auto-discoverable. Open http://localhost:8000/vocab and take a look at the content. The capabilities of the API are fully described in a machine-readable format: available resources, properties and operations, descriptions of elements, readable and writable properties, types returned and expected…

As with errors, the whole API is described using the Hydra Core Vocabulary, an open web standard for describing hypermedia REST APIs in JSON-LD. Any Hydra-compliant client or library is able to interact with the API without knowing anything about it! The most popular Hydra client is Hydra Console. Open a URL of the API with it and you’ll get a nice management interface.

API Platform Hydra console

You can also give a try to the brand new hydra-core JavaScript library.

DunglasApiBundle offers a lot of other features.

Read its dedicated documentation to see how to leverage them and how to hook your own code everywhere into it.

Specifying and testing the API

Behat (a behavior-driven development framework) is preconfigured with contexts useful for specifying and testing REST APIs and JSON documents.

With Behat, you can write the API specification (as user stories) in natural language then execute scenarios against the application to validate its behavior.

Create a Gherkin feature file containing the scenarios we ran manually in the previous chapter:
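A sketch using the Behatch-style REST and JSON context steps shipped with the framework (step wording may vary slightly between versions):

    # features/blog.feature
    Feature: Blog API
      In order to publish content
      As an API client
      I need to be able to create and retrieve people and blog posts

      Scenario: Create a person
        When I send a "POST" request to "/people" with body:
          """
          {"name": "Kévin"}
          """
        Then the response status code should be 201
        And the response should be in JSON
        And the JSON node "name" should be equal to "Kévin"

      Scenario: Retrieve the people collection
        When I send a "GET" request to "/people"
        Then the response status code should be 200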

The API Platform flavor of Behat also comes with a temporary SQLite database dedicated to tests. It works out of the box.

Just run bin/behat. Everything should be green.

You now have a powerful hypermedia API exposing structured data, specified and tested thanks to Behat. And still without writing a single line of PHP!

It’s incredibly useful for prototyping and Rapid Application Development (RAD). But the framework is designed to run in production. It benefits from strong extension points and has been optimized for very high-traffic websites (API Platform powers the new version of a major worldwide media site).

Other features

Dunglas’s API Platform has a lot of other features and can be extended with PHP libraries and Symfony bundles. Stay tuned, more documentation and cookbooks are coming!

A non-exhaustive list of what you can do with API Platform is available in its documentation.

The Angular client

Prerequisites: Node.js and npm must be installed.

Dunglas’s API Platform is agnostic of the client-side technology. You can use whatever platform, language or framework you want. As an illustration of the doors opened by an API-first architecture and of how easy developing SPAs can be, we will create a tiny AngularJS client.

Keep in mind that this is not an AngularJS tutorial and it doesn’t pretend to follow AngularJS best practices. It’s just an example of how simple application development becomes when all the business logic is encapsulated in an API. The sole responsibility of our AngularJS blog client is to display information retrieved by the API (presentation layer).

The AngularJS application is fully independent of the API. It’s an HTML/JS/CSS app querying an API through AJAX. It lives in its own Git repository and is hosted on its own web server. As it only contains assets, it can even be hosted directly on a CDN such as Amazon CloudFront or Akamai.

To scaffold our AngularJS app we will use the official Angular generator for Yeoman. Install the generator, then generate a client skeleton:
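Something like the following (the app name blog-client is arbitrary):

    npm install -g yo generator-angular
    yo angular blog-client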

Yeoman will ask some questions. We want to keep the app minimal:

  • don’t install Sass support
  • install Twitter Bootstrap (it’s optional but the app will look better)
  • uncheck all suggested angular modules

However, we will install Restangular, an awesome REST client for Angular that fits well with API Platform:
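    bower install --save restangular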

The Angular generator comes with the Grunt build tool. It compiles assets (minification, compression) and can serve the web app with an integrated development web server. Start it:
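    grunt serve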

DunglasApiBundle provides a Restangular integration guide in its documentation. Once configured, Restangular will work with the JSON-LD/Hydra API like with any other standard REST API.

Then edit the app/scripts/app.js file to register Restangular and configure it properly:
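A sketch along the lines of the bundle’s Restangular integration guide (the module name blogClientApp is what generator-angular derives from the app name; check the generated file):

    angular
      .module('blogClientApp', ['restangular'])
      .config(function (RestangularProvider) {
        // Base URL of the API (switch to the production URL when deploying)
        RestangularProvider.setBaseUrl('http://localhost:8000');

        // JSON-LD identifies resources by their "@id" IRI, not a numeric "id"
        RestangularProvider.setRestangularFields({ id: '@id' });
        RestangularProvider.setSelfLinkAbsoluteUrl(false);

        // Hydra wraps collections: the items live in the "hydra:member" key
        RestangularProvider.addResponseInterceptor(function (data, operation) {
          return operation === 'getList' && data['hydra:member'] ? data['hydra:member'] : data;
        });
      });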

Be sure to change the base URL with the one of your API in production (protip: use grunt-ng-constant).

And here is the controller retrieving the list of posts and allowing the creation of new ones:
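A minimal sketch (the blog_postings collection path is the one guessed from the entity name; check your API’s entry point for the exact path):

    angular.module('blogClientApp')
      .controller('MainCtrl', function ($scope, Restangular) {
        var posts = Restangular.all('blog_postings');

        // Load the list of posts from the API
        function loadPosts() {
          posts.getList().then(function (result) {
            $scope.posts = result;
          });
        }
        loadPosts();

        $scope.newPost = {};

        $scope.createPost = function () {
          posts.post($scope.newPost).then(function () {
            // Success: reset the form, clear errors and reload the list
            $scope.newPost = {};
            $scope.errorTitle = $scope.errorDescription = null;
            loadPosts();
          }, function (response) {
            // Failure: display the Hydra error returned by the API
            $scope.errorTitle = response.data['hydra:title'];
            $scope.errorDescription = response.data['hydra:description'];
          });
        };
      });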

As you can see, querying the API with Restangular is easy and very intuitive. The library automatically issues HTTP requests to the server and hydrates “magic” JavaScript objects corresponding to the JSON responses, which can be manipulated to modify remote resources (through PUT, POST and DELETE requests).

We also leverage the server-side error handling to display nice messages when the submitted data is invalid or when something goes wrong.

And here is the corresponding view, looping over posts and displaying the form and errors:
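A sketch of the template (Bootstrap classes are optional; property names match the Schema.org ones used above):

    <form ng-submit="createPost()">
      <div class="alert alert-danger" ng-show="errorTitle">
        <strong>{{ errorTitle }}</strong>
        <p>{{ errorDescription }}</p>
      </div>
      <input type="text" class="form-control" ng-model="newPost.name" placeholder="Title">
      <textarea class="form-control" ng-model="newPost.articleBody" placeholder="Content"></textarea>
      <button type="submit" class="btn btn-primary">Publish</button>
    </form>

    <article ng-repeat="post in posts">
      <h1>{{ post.name }}</h1>
      <p>{{ post.articleBody }}</p>
    </article>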

This is the end of the tutorial. Our blog client is ready and working! Remember the old way of building synchronous web apps? What do you think of this new approach? Easier and much more powerful, isn’t it?

Of course, some tasks remain to get a finished client, including routing, pagination support and injecting raw JSON-LD in the generated HTML for search engines (use the response extractor hook provided by Restangular). As these are pure Angular tasks, they are out of the scope of this introduction to API Platform. But it’s a good exercise to add such features to the client. Feel free to share your snippets in the comments.

Convinced of the ease of development and of the power of API Platform? Let us know your feedback in the comments!

Roadmap

The first beta is ready, and there are some tasks to complete before the first stable release:

  • The code is almost ready and is already used by awesome companies including Les-Tilleuls.coop, Smile, the TripAdvisor group and ExaqtWorld. But API Platform still needs to be bulletproofed. The first task before the release is testing and hunting bugs.
  • The documentation must be enhanced. This tutorial will be the basis of the brand new doc being written. A dedicated website is also in progress.
  • We also need more integrations (docs or libraries) with popular frontend technologies!

I hope to release the first stable version in July. You can help by contributing (bug reports, docs, code). Everything is hosted on GitHub and Pull Requests are very welcome.

Thanks

Special thanks to Samuel Roze (formerly Les-Tilleuls.coop and now Inviqa), Théo Fidry and Fabien Gasser (Smile), Maxime Steinhausser and Jérémy Derussé (SensioLabs) and Michaël Labrut (La Fourchette) for contributing code, docs and a ton of good ideas.

API Platform would never have emerged without the great technologies and libraries it uses as its foundation, especially Symfony, Doctrine, JSON-LD, Hydra and Schema.org. A big thanks to all their contributors.

Commercial support

Afraid of trying this new framework? Commercial support is already available:

  • Contact me for application design and architecture consulting.
  • Les-Tilleuls.coop can develop your software based on API Platform (and Symfony), and its team of great engineers provides training and consulting.

Continuous Integration for Symfony apps, the modern stack: quality checks, private Composer, headless browser testing…

Jenkins / Symfony

Updated 2014/11/21: fixed a bug when deploying (thanks @meyer_baptiste) and added a command to update the GitHub commit status (thanks @samuelroze).

At Les-Tilleuls.coop, we build great e-commerce platforms and high-performance web applications with quality and agility in mind.

Most of our applications share the same technical stack:

  • the PHP language with the Symfony framework and Doctrine for backend applications
  • frontend applications developed with CoffeeScript or JavaScript, AngularJS and Twitter Bootstrap (Grunt as build system, Yeoman for scaffolding, Karma and Jasmine for tests)
  • private Git repositories (hosted on GitHub)
  • private libraries and bundles shared between applications and exposed as Composer packages
  • phpspec unit tests
  • Doctrine Data Fixtures (enhanced with Alice and Faker)
  • Behat scenarios
  • exhaustive PHPDoc
  • capifony scripts for deployment

I managed to build a pretty cool CI system allowing us to deliver high-quality Symfony applications. In this tutorial, you’ll see how it looks and how to install it step by step. If you aren’t using a CI system yet, it will drastically increase the quality of your Symfony projects, and over time, it will improve your team’s development skills.

Our CI system mostly relies on Jenkins. At every commit (on every branch) it:

  • updates our private Composer package repository index
  • runs all tests and scenarios, including web acceptance tests through headless and real browsers
  • runs a bunch of quality checks on the code: security issues, dead code and code duplication detection, cyclomatic complexity analysis, proper structure and adherence to Symfony’s best practices
  • checks Coding Standards conformance
  • generates a browsable API documentation
  • instantly deploys the new version of the application on testing and sometimes even production servers (continuous deployment)

Finished feature branches are reviewed and merged into the master branch through GitHub Pull Requests by a lead dev, and only when all tests and checks are green.

The following tutorial can be followed on Debian (wheezy) and Ubuntu.

Installing Jenkins

The Jenkins team maintains a Debian repository providing up to date binary packages. Add it to your CI box and install the package (as root):
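At the time of writing, the commands looked like this (key URL and repository line as documented by the Jenkins project):

    wget -q -O - http://pkg.jenkins-ci.org/debian/jenkins-ci.org.key | apt-key add -
    echo "deb http://pkg.jenkins-ci.org/debian binary/" > /etc/apt/sources.list.d/jenkins.list
    apt-get update
    apt-get install jenkins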

If nothing went wrong, the Jenkins UI is now available on port 8080 of your CI box. Open it in your favorite web browser. So easy.

Securing the installation

Jenkins Security

Currently, anyone knowing the URL of your Jenkins install can take control of it. It’s a good idea to require credentials to access the CI:

  1. go to “Manage Jenkins” then “Setup Security”
  2. check “Enable security”
  3. in the “Security Realm” section, select your user provider (“Jenkins’ own user database” is a good default if you don’t have an LDAP directory or any other centralized user provider for your organization)
  4. don’t forget to check “Allow users to sign up”; it’s mandatory to be able to log in after enabling security
  5. in the “Authorization” section, choose “Project-based Matrix Authorization Strategy”, leave all checkboxes for “anonymous” empty and create a super admin user having all rights
  6. enable CSRF protection by checking “Prevent Cross Site Request Forgery exploits”
  7. save and go back to the homepage
  8. you’re prompted to authenticate; click “Sign up” and create an account with the same username you gave administration rights to earlier

Jenkins is now secured! Go back to security settings to disable the signup form.

Connecting Jenkins with GitHub repositories

Jenkins must be able to pull data from your private GitHub repositories. The best way I’ve found is to create a GitHub machine user with read-only access to the private repositories. You can add users with write access to your repositories from GitHub by clicking “Settings” then “Collaborators” (the write permission is necessary to update the build status on GitHub; if you don’t care about this feature, just grant read-only access to the repo).

Now, we’ll create private and public SSH keys for the UNIX account running Jenkins (on Debian it is called jenkins):
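A sketch (the jenkins user may have no login shell, hence the -s flag; accept the default key location, /var/lib/jenkins/.ssh/id_rsa):

    su jenkins -s /bin/bash
    ssh-keygen -t rsa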

The last step is adding the public key to the GitHub account of the machine user. From GitHub’s homepage, logged in as your machine user account, go to “Settings” then “SSH Keys”. Add a new key with the content of  /var/lib/jenkins/.ssh/id_rsa.pub.

Jenkins is now able to connect to our GitHub repositories.

To update the GitHub commit status, our machine user also needs an access token for the GitHub API. Go to the settings page of the user, click “Applications” then “Generate new token”, and be sure to check the “repo:status” permission. Save the token; we’ll use it in our build script.

Installing PHP

The PHP interpreter is needed to run our projects’ tests and other tools such as Composer and Satis. Install it:

apt-get install php5-cli php5-apc php5-sqlite

You noticed that I also installed APC to speed up script execution, as well as SQLite, which I use as the DBMS for the test environment of Symfony applications.

Installing Satis

Satis

Satis is the most popular (and the only open source) private Composer package repository. This tool generates a static Composer index. Later, we will add a task in Jenkins to rebuild the repository at each commit. It will give us an always up-to-date Composer package repository. To do that, we must install Satis and make the jenkins user able to run it.

Satis must be installed through Composer. Switch to the jenkins user and start by installing Composer, then Satis:
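Something like the following, run as the jenkins user (this creates a satis/ directory in its home, matching the satis/bin/satis path used below):

    su jenkins -s /bin/bash
    cd ~
    curl -sS https://getcomposer.org/installer | php
    php composer.phar create-project composer/satis --stability=dev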

Then, create a config file for Satis called packages.json:
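A sketch following Satis’s documented config format (the organization and repository names are placeholders for your own):

    {
        "name": "My private package repository",
        "homepage": "http://packages.example.com",
        "repositories": [
            { "type": "vcs", "url": "git@github.com:coopTilleuls/repo1.git" },
            { "type": "vcs", "url": "git@github.com:coopTilleuls/repo2.git" }
        ],
        "require-all": true
    }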

This config file enables two private Composer packages (repo1 and repo2). Thanks to the SSH keys we configured earlier, the jenkins user is able to connect to our private GitHub repositories. Of course, those repositories must have a composer.json file at their root. The full configuration options of Satis are documented on the Composer website.

Generate the package repository manually for the first time:

satis/bin/satis --no-interaction build /var/www/packages.json packages

The next step is to expose our packages over HTTP. We will use nginx for this. Switch back to the root user (Ctrl+d) and type:

# apt-get install nginx

Then change the content of /etc/nginx/sites-enabled/default by the following:
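A minimal sketch (point root at the directory where the Satis build output actually lives on your box):

    server {
        listen 80 default_server;

        # Directory containing the Satis build output
        root /var/www/packages;
        index index.html;

        # Restrict access to trusted IPs only
        allow 127.0.0.1;
        allow <my-ip>;
        deny all;
    }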

Don’t forget to replace <my-ip> with the list of IPs allowed to access the package repository. It prevents exposing your private repository to the whole internet.

Restart nginx:
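    service nginx restart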

You should now be able to browse the repository with your favorite web browser.

Enabling Git and GitHub support

GitHub Jenkins Webhook

Git and GitHub support are available through plugins. Go to “Manage Jenkins” then “Manage Plugins” and install plugins called “Git Plugin” and “GitHub Plugin”. Their names should be explicit.

Then, we will set up a GitHub webhook to trigger a build every time a commit is pushed, on any branch.

Go to your GitHub repository, click on “Settings” then “Webhooks & Services”. Click on the “Add service” select box and choose “Jenkins (GitHub plugin)”. Enter your Jenkins’ webhook endpoint in the box, it should be something like http://ci.les-tilleuls.coop:8080/github-webhook/  (just replace the domain part of the URL) and click “Add service”.

The webhook is set! Jenkins will be notified every time we push something to our GitHub repo. Repeat this step for every GitHub repository you want to trigger builds for.

Installing PHP CS Fixer

PHP CS Fixer is an awesome tool provided by SensioLabs. Its 1.0 version has just been released. It automatically fixes PSR-0, PSR-1, PSR-2 and Symfony coding standard violations.

With its --dry-run option, it can be used in our CI to check if the produced code is clean.

PHP CS Fixer is packaged as a PHAR archive. Let’s install it:
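As the jenkins user, in its home directory (the download URL is the one SensioLabs documented at the time):

    su jenkins -s /bin/bash
    cd ~
    wget http://get.sensiolabs.org/php-cs-fixer.phar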

Installing phpDocumentor

phpDocumentor generates a beautiful HTML documentation based on the DocBlocks documenting your classes and methods.

As with PHP CS Fixer, we will install its PHAR version, but before downloading it, we need to install some extra Debian packages:
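A sketch (Graphviz is needed for class diagrams; php5-xsl is an assumption based on phpDocumentor’s usual requirements):

    # as root
    apt-get install graphviz php5-xsl

    # as the jenkins user, in its home directory
    wget http://phpdoc.org/phpDocumentor.phar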

Web Acceptance Testing with Behat and PhantomJS

We love Behat and Mink. We use them to manage our user stories and to run automated web testing scenarios.

Standard Symfony controllers are tested with the Symfony2 extension. But nowadays, we create more and more single page applications, usually composed of a Symfony 2 backed JSON/REST API and an AngularJS client.

It sounds like a good idea to also test interactions between AngularJS clients and Symfony REST API in the CI system. Behat and Mink will help.

I usually use Mink Selenium2 to drive PhantomJS, an awesome headless browser based on WebKit. Unlike the Symfony2 extension, PhantomJS needs to access the application through a public URL. We need a front controller exposing the test environment. Let’s write it. We will also configure the built-in web server provided by Symfony to avoid setting up a heavier solution like nginx + PHP-FPM.

The first step is to create a new front controller for the test  environment (Symfony comes with front controllers for prod  and dev  environments but not for test ).

Create a new front controller called app_test.php in the web/ directory of your application. It should contain something like the following:
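A sketch adapted from the stock app_dev.php of the Symfony Standard Edition:

    <?php

    use Symfony\Component\HttpFoundation\Request;

    $loader = require_once __DIR__.'/../app/bootstrap.php.cache';
    require_once __DIR__.'/../app/AppKernel.php';

    // Boot the kernel in the "test" environment, with debugging enabled
    $kernel = new AppKernel('test', true);
    $kernel->loadClassCache();

    $request = Request::createFromGlobals();
    $response = $kernel->handle($request);
    $response->send();
    $kernel->terminate($request, $response);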

We also need a custom router for the built-in web server. It will allow the server to use the test environment. Create a file in the app/config/ directory of your application called router_test.php. Here is the sample code:
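A sketch adapted from Symfony’s stock router_dev.php:

    <?php

    // Router for the built-in web server: serve real files as-is,
    // send every other request to the test front controller.
    if (is_file($_SERVER['DOCUMENT_ROOT'].DIRECTORY_SEPARATOR.$_SERVER['SCRIPT_NAME'])) {
        return false;
    }

    $_SERVER['SCRIPT_FILENAME'] = $_SERVER['DOCUMENT_ROOT'].DIRECTORY_SEPARATOR.'app_test.php';

    require 'app_test.php';

You can then start the server with php app/console server:run --router=app/config/router_test.php.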

The next step is to install PhantomJS. The PhantomJS package is not available in Debian stable, so we will fall back to the binary version provided on the official website:
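A sketch using the 1.9.8 release, current at the time of writing (substitute the latest version):

    cd /usr/local
    wget https://bitbucket.org/ariya/phantomjs/downloads/phantomjs-1.9.8-linux-x86_64.tar.bz2
    tar xjf phantomjs-1.9.8-linux-x86_64.tar.bz2
    ln -s /usr/local/phantomjs-1.9.8-linux-x86_64/bin/phantomjs /usr/local/bin/phantomjs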

The behat.yml file of your project must be changed to specify the base URL and the Selenium2 WebDriver host to use. Here is a sample file:
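A Behat 3-style sketch (start PhantomJS with phantomjs --webdriver=8910 so the host below matches):

    default:
        extensions:
            Behat\MinkExtension:
                base_url: 'http://localhost:8000/'
                sessions:
                    default:
                        selenium2:
                            wd_host: 'http://localhost:8910/wd/hub'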

You are ready to write scenarios testing the whole app. They will be executed at each commit! To test consistency across browsers and platforms (mobile devices, exotic systems…), you can take a look at SauceLabs. This SaaS platform is well integrated with Behat and can be a good (but more expensive) replacement for PhantomJS!

Checking code quality with SensioLabs Insight

SensioLabs Insight

Earlier this year, SensioLabs published a SaaS platform running quality analyses for PHP and especially Symfony projects. The service is called Insight. It’s user friendly, fun to play with (it provides some gamification with a system of “medals”) and finds a lot of bad practices in bundles and Symfony applications.

The first step is to create an account and register your project on the Insight website. Configure your credentials, set the Git repository to analyse and manually run a first analysis. You will probably find some problems in your app that must be fixed!

SensioLabs Insight provides a tool allowing you to run new analyses from a CI system. Like the other tools we already installed, the insight command is available as a PHAR. Download it as the jenkins user:
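From the download URL SensioLabs documented at the time:

    wget https://get.insight.sensiolabs.com/insight.phar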

Find the uuid of the project you want to analyse with the following command:
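From memory, the command is something like the following (run php insight.phar list to check the exact command names):

    php insight.phar projects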

The first time you use insight.phar, you’ll be prompted for your user ID and API token. You can find them in your SensioLabs Insight account page (“API/SDK” tab). Your credentials will be saved by insight.phar.

We are now able to integrate SensioLabs Insight in the Jenkins script. Write the uuid of your project somewhere, we will need it later.

Insight can output violations in the PMD format. You guessed it, Jenkins has a PMD plugin able to display nicely that sort of output.

Go to “Manage Jenkins”, “Plugin Manager”, click on the “Available” tabs and search for “PMD plugin”. Install it.

SensioLabs Insight is free for public open source projects, but a subscription is needed to analyse private projects. I think it’s a good investment, but I can’t blame you if you prefer using open source software. Sebastian Bergmann (the author of PHPUnit) maintains a great documentation explaining how to set up open source quality checking tools including PHPMD, phploc, PHP_Depend and phpcpd. Feel free to complete your installation with jenkins-php.org.

Note: at the time of writing, there are some issues with the PMD output and the fail-condition option of the insight command. I’ve submitted fixes for them and I hope they will soon be integrated in the PHAR file.

Continuous Deployment with Capifony

It’s always good to have a testing server running the latest version of the project. Sometimes, we go one step further and let Jenkins push stable code to production using Git tags (not detailed here).

Anyway, I use a tool you probably already know: capifony. It’s deployment software specialized for Symfony projects, built on top of Capistrano. It’s the easy way to deploy your application. It handles copying the source code to the server using Git, setting correct directory permissions, installing the project’s dependencies, running database migration scripts, restarting services such as PHP FPM, and much more. If you don’t already use it, give it a try; you will love it.

Like most Ruby applications, capifony is available as a gem. There is nothing easier than installing it on Debian:
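A sketch (ruby-dev is an assumption, in case a dependency needs to compile native extensions):

    apt-get install ruby ruby-dev
    gem install capifony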

Creating the Jenkins project

Jenkins Project

Go back to the Jenkins Dashboard and click on the “create new jobs” button. On the next screen give a name to your project and choose “Freestyle project”.

On the next screen:

  1. fill the “GitHub project” input with the base URL of your GitHub repository

In the “Source Code Management” section:

  1. select “Git” as “Source Code Management”
  2. in “Repository URL”, enter the SSH clone URL of your Git repository (it must be like git@github.com:coopTilleuls/myrepo.git )
  3. click the “Add” button below “Credentials”
  4. choose “SSH Username with private key” as “Kind”
  5. choose “From a file on Jenkins master” for “Private Key” and click “Add” (this allows Jenkins to use the SSH keys we created and added to GitHub in the previous step)
  6. set “Branches to build” to empty (to build all branches of the repository)
  7. select “githubweb” as “Repository browser” and enter the URL of your repo’s homepage again

In “Build triggers”:

  1. check “Build when a change is pushed to GitHub”

In “Build”:

Add a new “Execute shell” build step and use the following script as a template to fit your own needs:
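A sketch wiring together the tools installed above (command names and options are from memory; adjust paths, ports and options to your setup, and run php insight.phar list to check the exact Insight command names):

    #!/bin/sh
    PROJECT_UUID=your-insight-uuid     # see "Checking code quality" above
    DEPLOY=false                       # set to true to deploy on success

    status=0
    mkdir -p build

    # Rebuild the private Composer package repository
    $HOME/satis/bin/satis --no-interaction build /var/www/packages.json /var/www/packages || status=1

    # Install the project dependencies
    php $HOME/composer.phar install --no-interaction || status=1

    # phpspec unit tests, JUnit output for the Jenkins report
    bin/phpspec run --format=junit > build/unit.xml || status=1

    # Behat scenarios: run the test server and PhantomJS in the background
    php app/console server:run localhost:8000 --router=app/config/router_test.php > /dev/null 2>&1 &
    SERVER_PID=$!
    phantomjs --webdriver=8910 > /dev/null 2>&1 &
    PHANTOM_PID=$!
    bin/behat || status=1
    kill $SERVER_PID $PHANTOM_PID

    # Coding standards (report only, don't fix)
    php $HOME/php-cs-fixer.phar fix --dry-run . || status=1

    # SensioLabs Insight analysis, PMD output for the Jenkins plugin
    php $HOME/insight.phar analysis $PROJECT_UUID --format=pmd > build/pmd.xml || status=1

    # API documentation
    php $HOME/phpDocumentor.phar -d src -t build/doc || status=1

    # Deploy only when everything is green
    if [ $status -eq 0 ] && [ "$DEPLOY" = true ]; then
        cap deploy || status=1
    fi

    exit $status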

Basically, the script runs the various tools we installed earlier and doesn’t stop when an error occurs. Instead, it collects the return status of the commands and, at the end of the script, reports whether the build is successful or not. This way, all checks always run, even if one fails. The deployment (if enabled) only occurs if the build is successful. The built-in web server and PhantomJS run in the background so they can work simultaneously with Behat. They are killed after Behat exits.

Don’t forget to customize the values of the variables at the top of the script.

Why not an XML file? Because I sometimes use other build servers such as Travis CI and Bamboo. Using a simple shell script allows me to easily replace Jenkins with another server. Shell scripts can also be versioned directly in the Git repository.

In “Post-build Actions”:

  1. Add the “Set build status on GitHub commit” step
  2. Add “Publish JUnit test result report” and specify build/unit.xml for “Test report XMLs”
  3. Add “Publish PMD analysis results” and specify build/pmd.xml for “PMD results”
  4. Add “Publish Javadoc” and set build/doc as “Javadoc directory”
  5. Add other actions you’re interested in (like sending mails)

Troubleshooting GitHub API rate errors

If you start getting errors like “Could not fetch https://api.github.com/[…], enter your GitHub credentials to go over the API rate limit” in the console output, it’s because you exceeded the GitHub API rate limit for anonymous users. Just open a shell, switch to the jenkins user, download something with Composer and enter your machine user’s credentials when prompted:
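For instance (any Composer download from GitHub will do; the project name is arbitrary):

    su jenkins -s /bin/bash
    cd /tmp
    php ~/composer.phar create-project symfony/framework-standard-edition rate-limit-test
    # enter the machine user's GitHub credentials when prompted, then clean up
    rm -rf rate-limit-test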

Because Composer stores an OAuth token in .composer/auth.json, all subsequent calls to GitHub will succeed, even when launched by Jenkins.

Troubleshooting mails

To make Jenkins able to send mails (when a build fails, for instance), you need to configure an SMTP server. The easiest way is to install a local server on your CI box:
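For instance, Debian’s default MTA:

    apt-get install exim4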

Default options should be OK.

To customize the mail parameters, from the Jenkins homepage go to “Manage Jenkins” then “Configure System”. SMTP settings are under the “E-mail notification” section, and the address used by Jenkins to send mails is set under “Jenkins Location”.

Updating the locally installed tools automatically

Like any software, the tools used by our CI server get updates fixing bugs and adding new features. Debian and Jenkins can be updated using apt. Jenkins plugin updates are managed directly through the Jenkins user interface. But the software we installed locally must be updated “by hand”. We will periodically run a small shell script I’ve written to update those tools.

Create the following update-tools.sh script as the jenkins user in its home directory ( ~jenkins):
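A sketch (it simply re-downloads the latest published PHARs and updates Composer and Satis; adjust to the tools you actually installed):

    #!/bin/sh
    cd ~

    # Composer and Satis
    php composer.phar self-update
    php composer.phar update --no-interaction --working-dir=satis

    # PHAR tools: re-download the latest published versions
    wget -q http://get.sensiolabs.org/php-cs-fixer.phar -O php-cs-fixer.phar
    wget -q http://phpdoc.org/phpDocumentor.phar -O phpDocumentor.phar
    wget -q https://get.insight.sensiolabs.com/insight.phar -O insight.phar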

Don’t forget to make it executable by executing  chmod +x update-tools.sh. To get the updates every night, run  crontab -e and add the following line:

00 00 * * * ~/update-tools.sh

Your CI system is now ready. Your full test suite and quality analyses now run against newly produced code in real time! We also have quality checks and tests specific to our frontend apps; if people are interested, maybe I’ll write another post detailing that part of our infrastructure.

I’m sure you have your own preferred quality tools and CI best practices for Symfony projects. Share them with us in the comments 🙂

DunglasAngularCsrfBundle: protect your Symfony / AngularJS apps against CSRF attacks

I create and I see more and more web applications sharing the same powerful architecture: a rich JavaScript client (often an AngularJS app) talking to a REST API built with Symfony on the server side.

These components share the same philosophy (built on top of dependency injection and MVC-like patterns, designed to be intensively tested) and play very well together.

This stack allows you to create awesome, blazing-fast web applications. Better yet, the client and the server parts of the app are loosely coupled, can evolve separately and can even be maintained by different teams.

However, this kind of app often suffers from security problems, especially Cross-Site Request Forgery (CSRF or XSRF) vulnerabilities.

Both Symfony and AngularJS provide their own CSRF protection mechanisms, but by default they are neither interoperable nor enabled. Thanks to a recent refactoring of Symfony’s Security component, it’s now possible to make both systems work together cleanly, and I’ve just released an open source bundle doing exactly that: DunglasAngularCsrfBundle.

This bundle provides out of the box CSRF protection for AngularJS apps interacting with a Symfony-backed app.

Despite its name, it does not depend on AngularJS and can also be used with Chaplin.js / Backbone.js, jQuery or even raw JavaScript. To do so, install and configure the bundle, then add an HTTP header called X-XSRF-TOKEN to your XHR requests, containing the value of the token set in a cookie during the first HTTP request. The bundle will automatically check the validity of the provided token. If it is not valid, an Access Denied error (HTTP 401) will be thrown.
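For non-Angular clients, here is a minimal raw-JavaScript sketch (assuming the bundle’s default cookie name, XSRF-TOKEN, which matches AngularJS’s convention; the /api/posts endpoint is hypothetical):

    // Read a cookie value by name
    function getCookie(name) {
      var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
      return match ? decodeURIComponent(match[1]) : null;
    }

    // Mirror the CSRF cookie into the X-XSRF-TOKEN header, as AngularJS's $http does
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/api/posts');
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.setRequestHeader('X-XSRF-TOKEN', getCookie('XSRF-TOKEN'));
    xhr.send(JSON.stringify({title: 'Hello'}));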

The bundle is fully tested with phpspec and obtains a platinum medal on the brand new (awesome) SensioLabs Insight quality monitoring system.

Documentation of the internals and installation instructions are provided on the GitHub page of the bundle. Check it out, test it, star it and tell me what you think of it!

Download DunglasAngularCsrfBundle on GitHub.