CityBOT slack-gis-tips

Here are the first steps to becoming a GIStipster

With these first 3 steps, you can enter the world of GIStipsters. The network of GIS enthusiasts and professionals that I want to build together with you takes shape in this sort of gamification-style CRM. I built this platform on the open source CMS WordPress and released everything under CC-BY-SA as open source code.

  • why become a GIStipster:
  • accessing cityplanner-it:
  • user map:

slack-gis-tips services


The Capital Planning Platform is a new resource for collaborative planning, powered by open data and open source technology.

The New York City Department of City Planning pioneered open data with Bytes of the Big Apple a decade ago. With the creation of the DCP's Capital Planning Division in 2014, we envisioned a new civic technology resource: the Capital Planning Platform – a place for planners to access the maps, data, and analytics that they need to plan for public investments in neighborhoods and collaborate with one another. The NYC Facilities Explorer (beta) is a first step in building this vision. Over the months and years to come, we plan to add more map layers, new and improved datasets, and new analysis tools to this mapping platform to help automate a broad array of planning analyses and make the capital planning process more efficient, coordinated, and strategic across the public and private sectors in New York City.

The Capital Planning Platform complements other data and maps that DCP produces. We also encourage users to explore the following resources, among others, on DCP's website.

  • NYC Census FactFinder – An interactive tool for creating demographic, social, economic, and housing profiles for neighborhoods and user-defined groupings of Census tracts.
  • PLUTO and MapPLUTO – Extensive land use and geographic data at the tax lot level in multiple formats.
  • Zoning and Land use Application (ZoLA) – ZoLA provides a simple way to research zoning regulations in New York City.
  • Waterfront Access Map – This interactive map identifies and provides information about New York City’s inventory of publicly-accessible waterfront spaces.
  • Community Portal – The DCP Community Portal offers resources on a variety of topics related to land use, community planning, and demographic trends for each of New York City’s 59 Community Boards.

slack-gis-tips services tutorial

Introducing CARTO SalesQuest: Location-Based Sales Analytics

We’ve talked a lot on this blog about how different business functions use Location Intelligence in very different ways to solve very different problems.

This has led customers to ask us if we would ever build specific solutions for use cases. Today, with years of experience and thousands of customer applications deployed, we’re excited to announce our first solution built on the CARTO platform: CARTO SalesQuest.

CARTO SalesQuest is a location-based Sales Analytics solution that applies spatial analysis and location data streams to your company’s sales CRM data, boosting your team’s sales performance.

Analyzing sales data and making strategic decisions to improve your sales team’s performance has traditionally relied on sales analytics tools that focus on when sales happen: How long is my team’s sales cycle? What is our pipeline for the next three months? What is our win rate for last fiscal quarter?

These are vitally important questions for a sales team, but in order to truly optimize your sales performance in real-time, you need the ability to ask questions about where sales happen:

  • Where are there high-value opportunities in my team’s sales territory?
  • Where should I hire, assign, and deploy additional field reps based on opportunity value?
  • Where might there be potential new customers based on open data about demographics or consumer trends?

CARTO SalesQuest puts the power of location right in your sales reps' hands, helping them to find locations of nearby prospects, visualize their customers according to time of last touch, or even prioritize sales visits based on opportunity value.

Boosting sales performance with location

We’ve spoken with many sales leaders about optimizing sales practices around customer segmentation, sales territory design, and territory management.

While developing CARTO SalesQuest, we asked sales leaders what was most important for them in a sales analytics solution. Here’s what they said:

  • Role-Based Access. Users can assign view access according to your organization’s internal structure to maintain workflows for existing sales territories.
  • Mobile & Desktop Ready. Access SalesQuest in the office or in the field with a responsive design that adjusts to whatever device sizes you and your sales team use.
  • CRM ready. SalesQuest is ready to plug and play with your existing CRM system.

These features are important, but we also knew that in order for sales teams to optimize in real-time, we would have to augment their CRM sales data with other location data streams.

What does this look like in the world of field sales?

A leading security company we work with was able to give their sales reps data about their opportunities that other companies weren’t able to provide. They used Open Data on crime statistics across the different cities their reps were assigned to in order to identify potential new business for alarm and security service sales.



Optimize sales visit schedules

In a recent Salesforce study, a large majority of sales representatives cited internal inefficiencies as the cause of their team’s productivity gap. In fact, respondents admitted that on average only one-third of their work week is spent selling, while the rest is spent on more administrative tasks.

Addressing this challenge is all about making the field experience more intuitive, making selling as easy as planning a route in Google Maps or booking a hotel in the ideal location. That’s precisely why SalesQuest is built on CARTO’s simple interactive map interface, so that planning an efficient business trip is data-driven but still simple. This allows your reps to:

  • Visualize the location of nearby prospects, customers, and sleepers. Minimize travel costs using route optimization when planning visits.
  • See customers according to time of last touch. Maintain customer satisfaction with balanced coverage across the sales area.
  • Visualize nearby renewal opportunities. Reduce churn rates with visits to customers with expiring contracts.
  • Identify highest-value opportunities. Prioritize sales visits by highest-value customers and prospects.

For one client, equipping field sales reps with SalesQuest has led to a 6% increase in average selling prices (ASPs) recorded by sales representatives, a 9% increase in the number of clients visited per month, and a 12% decrease in travel time for sales representatives.

Identify sub-optimal sales behavior

Operations managers can also take advantage of the role-access view to analyze trends and patterns in sales behavior that could be putting their quota well out of reach.

Recently, a company’s head of sales for Europe found that the average selling price on new transactions had been decreasing significantly. The head of sales wanted to figure out how and where her sales team could change behaviors to make sure this trend didn’t continue.

The image below shows a map of the CRM sales data, filtered to new business opportunities with an average price of $60,000 or less within the European sales region.

SalesQuest Demo

The distribution of opportunities is spread out across the continent, which doesn’t yet provide the head of sales with actionable insights on how best to change sales behavior. But filtering the sales data down to the time period when the most sales are closed, which tends to be the end of the quarter, may help identify which area is in need of help.

In the image below we see a drastic difference in the number of low-dollar new transactions toward the end of the sales quarter in and around Germany.

SalesQuest Demo

Upon closer inspection, we can pinpoint that most of the low-dollar new business transactions occur in the city of Hamburg, Germany.

SalesQuest Demo

This granular insight allowed the head of sales to begin implementing changes for this specific team on the ground, refocusing them on higher-value new opportunities, business expansion, and renewal opportunities.

Our team is ready and waiting to hear from you. What are your biggest sales optimization challenges? Where do you feel like you have sales blind spots? Reach out to our team to start a conversation!


Text, popups, and formatting – Exercise 1.6


Graphic customizations – Exercise 1.5


Libraries, style sheets, and creating the base map – Exercise 1.3

slack-gis-tips services

Servizi Abitativi Pubblici (SAP) – Public Housing Services

The service contains the number of applications for public housing services (SAP) that appear in the ranking list pursuant to Regional Law (L.R.) no. 16/2016.


Loading GeoJSON – Exercise 1.4

Map tutorial slack-gis-tips

Movement data in GIS #10: open tools for AIS tracks

The data provider covered in this post is a great source for AIS data along the US coast. Their data formats and tools, though, are less open. Luckily, GDAL – and therefore QGIS – can read ESRI File Geodatabases (.gdb).

They also offer a Track Builder script that creates lines out of the broadcast points. (It can also join additional information from the vessel and voyage layers.) We could reproduce the line creation step using tools such as Processing’s Points to path. But this post will show how to create PostGIS trajectories instead.

First, we have to import the points into PostGIS using either DB Manager or Processing’s Import into PostGIS tool:
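Whichever import route is used, the trajectory query benefits from an index matching its ORDER BY. A minimal sketch, assuming the points landed in a table ais.broadcast with columns mmsi, basedatetime, and geom (these names follow the query in this post, but your import may differ):

```sql
-- Assumption: the broadcast points were imported into ais.broadcast
-- with vessel id (mmsi), timestamp (basedatetime), and geometry (geom).
-- This index supports the ORDER BY mmsi, basedatetime step:
CREATE INDEX IF NOT EXISTS broadcast_mmsi_time_idx
  ON ais.broadcast (mmsi, basedatetime);
```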

Then we can create the trajectories. I’ve opted to create a materialized view:

The first part of the query creates a temporary table called ptm (short for PointM). This step adds time stamp information to each point. The second part of the query then aggregates these PointMs into trajectories of type LineStringM.

 WITH ptm AS (
   SELECT b.mmsi,
     st_makepointm(
       st_x(b.geom),
       st_y(b.geom),
       date_part('epoch', b.basedatetime)
     ) AS pt,
     b.basedatetime t
   FROM ais.broadcast b
   ORDER BY mmsi, basedatetime
 )
 SELECT row_number() OVER () AS id,
   st_makeline(ptm.pt) AS st_makeline,
   min(ptm.t) AS min_t,
   max(ptm.t) AS max_t
 FROM ptm
 GROUP BY ptm.mmsi

The trajectory start and end times (min_t and max_t) are optional but they can help speed up future queries.
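As a sketch of how those columns pay off (the view name ais.trajectory is an assumption; the query above does not name it), a time-window filter can use min_t and max_t directly instead of unpacking M values from the geometry:

```sql
-- Trajectories active at any point during 2014-01-15,
-- using the precomputed start/end timestamps
SELECT id
FROM ais.trajectory
WHERE max_t >= timestamp '2014-01-15'
  AND min_t <  timestamp '2014-01-16';
```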

One of the advantages of creating trajectory lines is that they render many times faster than the original points.

Of course, we end up with some artifacts at the border of the dataset extent. (Files are split by UTM zone.) Trajectories connect the last known position before the vessel left the observed area with the position of reentry. This results, for example, in vertical lines which you can see in the bottom left corner of the above screenshot.

With the trajectories ready, we can go ahead and start exploring the dataset. For example, we can visualize trajectory speed and/or create animations:

Purple trajectory segments are slow while green segments are faster
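One way such a speed rendering can be derived is sketched below: since timestamps live in the M coordinate (epoch seconds), consecutive vertex pairs yield per-segment speeds. The view name ais.trajectory and the st_makeline column name follow the query above but remain assumptions; speed comes out in CRS units per second.

```sql
-- Per-segment speed from consecutive PointM vertices
WITH pts AS (
  SELECT id,
         (dp).path[1] AS n,   -- vertex index along the line
         (dp).geom    AS pt   -- the PointM vertex itself
  FROM (SELECT id, ST_DumpPoints(st_makeline) AS dp
        FROM ais.trajectory) d
)
SELECT a.id,
       ST_MakeLine(a.pt, b.pt) AS segment,
       -- distance in CRS units divided by elapsed seconds (M values);
       -- NULLIF guards against duplicate timestamps
       ST_Distance(a.pt, b.pt) / NULLIF(ST_M(b.pt) - ST_M(a.pt), 0) AS speed
FROM pts a
JOIN pts b ON b.id = a.id AND b.n = a.n + 1;
```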

We can also perform trajectory analysis, such as trajectory generalization:
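As one example under the same assumptions (view name ais.trajectory; the tolerance is an arbitrary value in CRS units):

```sql
-- Douglas-Peucker simplification of each trajectory. Note that
-- ST_Simplify operates on the XY footprint, so the result is suited
-- to rendering rather than time-aware analysis.
SELECT id, ST_Simplify(st_makeline, 0.01) AS generalized
FROM ais.trajectory;
```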

This is a first proof of concept. It would be great to have a script that automatically fetches the datasets for a specified time frame and list of UTM zones and loads them into PostGIS for further processing. In addition, it would be great to also make use of the information in the vessel and voyage tables, thus splitting up trajectories into individual voyages.


ENG slack-gis-tips

Geospatial Data Science… Gone Wrong!!

This Halloween, the EU’s General Data Protection Regulation (GDPR) is perhaps the most frightening thing that is lurking around the corner…

GDPR highlights the fact that increased availability of data comes with a potentially devastating catch – and that data scientists, who are responsible for this information, need to take extra care.

The regulation, which comes into force in May 2018, is focused on strengthening and unifying data protection for all individuals within the EU. This means that any information (including locational information) about an EU citizen which is on your organization’s servers will need to be managed and protected carefully.

The logic behind GDPR is that data collected for various purposes can be easily fused with other open data in order to build a bigger picture of an individual. This means that information, such as purchase history records or a daily commute path, can be used as a stepping stone to gathering more insights about a particular person. If, for example, your innocent Google searches have ever resulted in you looking at a complete stranger’s social life, then you may understand how easily this can happen.

As scientists, analysts and marketers know well, the more information you have, the easier it is to build on this information and to develop deeper insights.

That’s why it’s called ‘data science’…

It’s about forming connections.

GDPR is about protecting the anonymity of European citizens in an age when these citizens are sharing online more and more information about themselves – either knowingly or unknowingly. The purpose of GDPR is to ensure that this information is being limited on an almost ‘need-to-know’ basis, that it is being protected by data and IT custodians and that it is not being misused in any way.

And now for the terrifying thing…

Aside from reputational damages, the penalties facing organizations that breach GDPR could be financially crippling…