
A new development era

A few months ago, Tarek Ziadé posted an interesting essay on his blog: A new development era. Summing up: web technologies (HTML5, JS) are gaining importance for building complex apps on the client (whatever it is: desktop, browser, phone, tablet), while the server side is becoming a proxy to lightweight services the client interacts with.

(Figure: timeline contrasting the 2000–2012 era with the one starting in 2015)

The post resonated with me because my work over the last few months had been building a rich client app in Javascript (with Backbone) backed by lightweight JSON services built in python/pyramid. But, as far as I have seen, this tendency is more widespread than I thought: it is happening not only among early adopters, but also among big players in the desktop realm (Windows, GNOME), and even among the old guard building java server applications. Maybe it is a new swing of the pendulum, or just that the promise of cross-platform apps that just work in multiple environments is appealing. What is certain is that the next wave of apps seems to be heading in that direction.

(Geo) Database evolution while developing

Over the last year, I followed with interest the discussion within the postgresql community about different approaches to evolving the design of a database. What follows is my take on it: how this year I developed a project with an intensely evolving DB design using an agile approach.

The context

My requirements for this project were twofold:

  • An evolving DB design: at the beginning of the project I didn't know what the DB design was going to look like. I had decided to use some advanced data-modeling techniques I had never used in production (dynamic segmentation and linear referencing with PostgreSQL/PostGIS, see the sketch after this list) and needed an approach that supported my evolving understanding of the domain.
  • Intense collaboration with analysts: the project required intense data-processing work to polish and create the data for the application. I knew this would be an iterative process where developers and analysts would collaborate to define and clarify the model we needed.
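
To give a taste of what that modeling looks like, here is a minimal linear-referencing sketch with PostgreSQL/PostGIS. The roads and speed_limits tables, their columns and the SRID are hypothetical, made up for illustration; they are not the project's actual schema:

    -- A road axis, plus attributes located along it by measure.
    CREATE TABLE roads (
        id   serial PRIMARY KEY,
        geom geometry(LineString, 25830)  -- hypothetical SRID
    );

    -- Dynamic segmentation: attributes are stored as ranges over the line
    -- (here, fractions of its length), not as separate geometries.
    CREATE TABLE speed_limits (
        road_id   integer REFERENCES roads (id),
        from_frac double precision,  -- segment start, fraction in [0, 1]
        to_frac   double precision,  -- segment end, fraction in [0, 1]
        limit_kmh integer
    );

    -- Segment geometries are derived on demand when needed:
    SELECT r.id,
           s.limit_kmh,
           ST_LineSubstring(r.geom, s.from_frac, s.to_frac) AS segment_geom
    FROM roads r
    JOIN speed_limits s ON s.road_id = r.id;

The appeal of the technique is that attributes that change along a line don't force you to split the geometry: the segments are computed from the measures whenever you need them.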

My approach

So, in the process of improving and automating my delivery pipeline, I set some rules for the project:

  • DB management through SQL and version control: the database was created from DDL scripts, and data was stored as CSV (if alphanumeric) or SQL (generated from Shapefiles to store geographical information). A minimal example follows this list.
  • Application and database evolve together: so their code should too, which in practice means I put the app and DB directories/projects under the same git repo.
  • Test driven development: I needed to break the problem into small chunks I could deal with while my understanding of the domain improved. Besides, when refactoring the DB (schemas, triggers, functions, etc.), which happened frequently, I needed to know all the pieces were working OK. I decided to use pgTAP for that.
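
As an illustration of the first rule, this is a minimal sketch of a versioned DDL script plus its CSV data load, run through psql; the schema, table and file names are invented for the example:

    -- schema/010_stations.sql: structure lives in versioned DDL scripts.
    CREATE SCHEMA IF NOT EXISTS inventory;

    CREATE TABLE inventory.stations (
        code text PRIMARY KEY,
        name text NOT NULL
    );

    -- data/stations.csv lives in the same repo and is loaded on each rebuild.
    \copy inventory.stations (code, name) FROM 'data/stations.csv' WITH (FORMAT csv, HEADER true)

Since everything is plain text, the diff of a commit shows at a glance both the structural change and the data that changed with it.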

And how did it turn out?

  • The pipeline worked smoothly: both the analysts and the developers were working in their comfort zone with the proper tools; desktop GIS applications for the former, command-line and SQL for the latter.
  • git provides an excellent mechanism for versioning text, so I had powerful tools at hand for versioning SQL structure and data (diff, cherry-pick, interactive rebases, etc). Besides, seeing where the data varied (names and types of fields, their values, etc) allowed us to discover some bugs and problems early.
  • Database and application evolved at the same pace. By tagging the versions, we could build in seconds the binaries needed for any version of the application with the proper DB.
  • Tests at DB level are a life-saver. pgTAP allowed me to refactor the database with no risk and a lot of confidence in what I was doing. I had all kinds of tests: checking that a trigger is launched when an UPDATE happens, that a function works, data integrity and model validation after the initial restore, etc. A small example follows this list.
  • Same process for deploying to development, staging and production environments, which resulted in fewer errors and no panic moments.
  • Having the data in the repo and regenerating the DB from scratch was very comfy and quick (less than a minute on my laptop for the whole DB: 100MB of raw SQL), with similar numbers when deploying to staging over the wire. On a daily basis I only had to regenerate specific schemas of the DB, so waits were on the order of seconds.
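
For a taste of what those DB-level tests looked like, here is a minimal pgTAP sketch along the lines described above; the schema, table, trigger and function names are hypothetical:

    -- tests/stations.sql, runnable e.g. with: pg_prove -d mydb tests/stations.sql
    BEGIN;
    SELECT plan(3);

    -- The trigger and the function are in place...
    SELECT has_trigger('inventory', 'stations', 'stations_audit_trg');
    SELECT has_function('inventory', 'station_label', ARRAY['text']);

    -- ...and the initial restore left the data consistent.
    SELECT is(
        (SELECT count(*) FROM inventory.stations WHERE name IS NULL),
        0::bigint,
        'no station is missing its name after the restore'
    );

    SELECT * FROM finish();
    ROLLBACK;

Wrapping the tests in a transaction that rolls back keeps the database untouched, so the suite can run safely against any environment.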

Coda

We should treat the database as just another deliverable to our clients and hold it to the same quality standards and methodology when developing it. In that sense, agile philosophy and practices fit DB evolution very well.

At the tools level, I was reluctant to introduce new tools and steps I didn't know very well on such a tight schedule, so I decided to stick to the basic and spartan (git repo, shell scripts, pgTAP and SQL), then iterate and grow a solution for our specific case. Although I missed some refactoring tools, it turned out to be a good approach, and now I'm in a good position to know the tradeoffs of the process, which in future projects will help me choose a specialized tool, if necessary.
