  • Datos para el desarrollo
18 Jan 2015

Good Ideas for Government 2.0 (Part Three)

Yo Gobierno

Following up on this series, in which we highlight the trends behind the stories we share at YoGobierno, parts three and four focus on upcoming trends at the intersection of technology, government and citizen participation. You’ll see that most of these are ideas we have yet to implement in the region, but you’ll also see that there are many cutting-edge experiences among us.

Catch up with Part One and Part Two.


11. Chief Data Officer

Around 2003, enterprises, especially in the private sector, began recognizing that working with data (managing it, analyzing it and publishing it) had become important enough to merit a top executive position. As information technology climbed to the top of the organizational chart, the “Data Scientists” as we know them today reached the same recognition. Turn the clock forward another decade, and after the explosion of Open and Big Data, governments began appointing Chief Data Officers, starting at the local level.

In 2011, in the city of New York, Rachel Haot became the first person appointed to this role in local government. Philadelphia and San Francisco followed suit, confirming a trend that we will surely continue to see across the great metropolises during this decade. The position is necessary insofar as local governments walk the path toward becoming Smart Cities, a process that requires constant data analysis to control, predict and react to a city’s outputs, guiding it toward being efficient, friendly and sustainable.

Working with data is not limited to analysis. In an environment of increased transparency and speed, managing data is also key for Open Governments: record management has to be part of government plans, considering that data cannot be limited to internal use only. In addition, notions like Open Data by Default (as mentioned previously), the use of Application Programming Interfaces (APIs), and the need for real-time data availability all mean that automated processes, and the people in charge of them, are key. For that, someone is needed to guide those processes and structures.

12. APIs

Picking up on the previous point, more and more governments understand the need not only to publish data, but to do so in real time, providing results, analyses and processed information rather than just raw data. Some of the data governments release only makes sense if it is provided in real time.

As an example, consider meteorological data. Historical records of rainfall, temperature and wind speed are a great tool for searching for patterns over time, and can obviously be used by organizations in areas such as agriculture and fishing, air and sea navigation, and so on. However, to make actual use of these data, two things are needed: very rapid or real-time publication, and a large capacity for data processing to generate climate models. That is where it makes sense not only to publish data in real time, but also to provide an open service (an API) that allows for data processing. Thus, for example, a person can use a mobile application that shows the weather after the data has been processed by meteorological organizations.
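To make the contrast between raw records and processed results concrete, here is a minimal sketch in Python. The readings, field layout and function name are invented for illustration and do not reflect any agency’s actual data format; the point is the kind of aggregation an open service could perform once, sparing every consumer from redoing it:

```python
from statistics import mean

# Hypothetical raw readings, as a meteorological agency might publish them:
# (timestamp, rainfall in mm, temperature in °C)
records = [
    ("2015-01-18T10:00", 2.4, 27.1),
    ("2015-01-18T11:00", 0.0, 28.3),
    ("2015-01-18T12:00", 5.1, 26.0),
]

def summarize(rows):
    """Aggregate raw readings into the kind of processed result an API
    could serve directly, instead of shipping the full raw dataset."""
    return {
        "total_rainfall_mm": round(sum(r[1] for r in rows), 1),
        "avg_temperature_c": round(mean(r[2] for r in rows), 1),
    }

print(summarize(records))
# → {'total_rainfall_mm': 7.5, 'avg_temperature_c': 27.1}
```

A mobile weather app would then request only this small summary, rather than downloading and crunching the agency’s full historical records itself.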

When we speak of APIs (Application Programming Interfaces), we are talking about the release of data to be read not by people, but by computer applications. When a government provides a service through which an application can check when a bus is expected to arrive at the bus stop, it is providing the most up-to-date, already processed information, saving the consuming device both storage and processing effort.
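As an illustration of data “read by applications,” here is a short Python sketch that parses the kind of JSON payload a transit API might return. The payload, field names and stop number are all invented for the example, not any real agency’s API:

```python
import json

# Hypothetical response from a transit API for one bus stop.
sample_response = """
{
  "stop_id": "4021",
  "arrivals": [
    {"route": "45",  "eta_seconds": 420},
    {"route": "102", "eta_seconds": 180}
  ]
}
"""

def next_arrivals(payload):
    """Turn a raw API response into human-readable arrival estimates,
    sorted so the soonest bus comes first."""
    data = json.loads(payload)
    ordered = sorted(data["arrivals"], key=lambda a: a["eta_seconds"])
    return [f"Route {a['route']}: {a['eta_seconds'] // 60} min" for a in ordered]

for line in next_arrivals(sample_response):
    print(line)
# → Route 102: 3 min
# → Route 45: 7 min
```

The application asks the service only for this small, current slice of data each time it needs it, which is exactly what makes real-time publication through an API more useful here than a bulk data download.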

This service model (which in many cases means services in the cloud) opens some interesting questions, including whether a government could or should charge for them. An interesting answer is given by Guillermo Moncecchi (Head of IT at Montevideo) in his article “Towards a digital public infrastructure: Why do governments have the responsibility to go open?”:

Open data should be seen as a component of an effort to build a public digital infrastructure, where people could, within the law, do whatever they want. Exactly as they do with roads. When you see open data in this light, several decisions become easier. Should we ask people for identification to give them our data? Answer: do you ask them for an identification to use the street? No, you don’t – then no, you shouldn’t.

Under the same vision, government APIs are part of an infrastructure that allows citizens to develop their lives, businesses and more. Citizens can derive value from them, just as a taxi driver derives value from streets in good condition where she can circulate quickly and safely.

13. Open Data Policy

The only way to ensure that Open Data does not depend on the goodwill of well-meaning public servants, or on moments in which it is trending (which can disappear as fast as they arrived), is to secure it through established policies and an infrastructure of legal frameworks and budgets needed to sustain the timely and efficient release of data.

This is the conclusion reached by various governments around the world, one of the most widely publicized examples being the Open Data Policy implemented by the Obama administration in the United States, presented with fanfare in May 2013. Part of the attention generated by that particular initiative was due to its ambition and scope, as it is not the kind of thing that can be implemented overnight: it requires a profound transformation in the way agencies and organizations work with, store and publish their data, an issue that did not escape those in charge of carrying it forward.

In our region, a good example comes from Mexico which, as a co-founder of the Open Government Partnership (OGP), in 2014 took on the challenge of a National Open Data Policy as part of its National Digital Strategy and Plan of Action for OGP. This policy not only sets guidelines for data publication but also includes mechanisms for access, feedback, participation and reporting, along with capacity-building initiatives and privacy guarantees. It also creates a Council for Open Data to verify the fulfillment of these goals and to propose improvements.

14. Coordination and Collaboration in Emergencies

The role of social networks in natural disasters has been seriously questioned and discussed in recent years. While their power to disseminate images and information and to create awareness has been demonstrated, it is also true that it is of little use for millions of people to join in sentiment if none of them moves to action. International organizations such as UNICEF Sweden have campaigned directly against the false sense of “participation” that a “like” or a “share” on social media provides when money is what is actually required to alleviate humanitarian emergencies.

However, this is not a technology problem but a problem of how technology is used, and while cases like these question the benefits of technology, there are concrete, productive ways to assist and participate in disaster situations. Well-known cases of effective use of technology include Ushahidi and OpenStreetMap after the earthquake in Haiti, when volunteers around the world used these applications to update and manage maps to support the work done by organizations and rescuers in the field.

 

15. Civic Labs

It’s not news that the relationship between governments and hackers has been very fruitful, being perhaps one of the more effective policies for innovating with technology in government (among those governments willing to try this kind of collaboration, of course). Globally known cases include Code for America, New Urban Mechanics and GovLab.

While in many cases organizations from civil society and academia were the ones seeking out governments to collaborate on innovation, a recent trend shows that governments themselves are implementing innovation labs. There is no need to go far to find excellent examples. One is the city of Buenos Aires, which, without a physical space, conceives as a laboratory its strategic and ongoing relationship with developers and entrepreneurs in the city. Another example is the Data Lab in Mexico, where the government experiments with open data applications and open data policies to ensure a more effective use of information. One of their “experiments” was to try residencies, whereby important minds in the field of technology are invited to work and collaborate with the innovators in the lab; in this case, they invited Perry Chen, co-founder of Kickstarter, to contribute fresh points of view.

But why should local governments implement these initiatives? As Rudi Borrmann, Director General of Information and Government in Buenos Aires, said in an interview with Puntogob:

“Cities lead open government at a scale that is more tangible and understandable. Participation, monitoring and public support make more sense. It is in cities where new, interdisciplinary teams emerge forcing the concept of open government to evolve and expand to new explorations.”

As with the first experiences with Open Data, local governments seem to take the lead and have more incentives for this approach. However, it is possible, as the Danish MindLab shows, for these initiatives to expand to the national level and to achieve collaboration among agencies and municipalities.


 

This series is based on the two Ebooks written by Daniel Carranza, which you can download here and here. Catch up with Part One and Part Two.

 
