Export Overpass-turbo Query Result to OSM File

I have a tool whose input is an .osm file; I usually get the file by exporting map data from http://openstreetmap.org.

But now I need only road data, so I use this query http://overpass-turbo.eu/s/2vM and it shows what I need.

My problem is that when I select the "Export" tab and choose the JOSM option for exporting to an .osm file, the site shows me this pop-up:

Remote control not found. :( Make sure JOSM is already running and properly configured.

I don't know how to configure JOSM properly. (Sorry, my English isn't very good.)


In JOSM, open the preferences (the icon with the switches), then click the remote control icon on the left (above WMS/TMS).

There you can activate the remote control checkbox.

Your query is missing meta data, which is mandatory for JOSM. Auto-repair will insert it.

Alternatively, change your Overpass query so that it outputs the meta data itself (in Overpass QL, "out meta;" instead of a plain "out;").

Then you can click on the Data tab on the right, select all, copy it to the clipboard, and paste it into an empty text file.
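Since the shared short link's contents aren't reproduced here, the exact query can't be shown; but assuming it selects roads in the visible bounding box, a version that includes the meta data JOSM requires could look like this (the highway filter is an assumption):

```
[out:xml][timeout:25];
// assumed filter: all ways tagged as roads in the current view
way["highway"]({{bbox}});
// recurse down to the member nodes so the ways are complete
(._;>;);
// "out meta" adds version, timestamp, changeset and user,
// which JOSM needs when loading the result
out meta;
```

The only change that matters for the JOSM error is the final statement: "out meta;" instead of "out;".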


Oracle ® Solaris Cluster Geographic Edition Data Replication Guide for ZFS Snapshots

The following table lists the Status and Status Message values that are returned by the clresource status command when the State of the Oracle Solaris ZFS snapshot replication status resource is not Offline.

Examine the resource log file, console and syslog messages, and the trace log for more information about the failures. The ZFS commands executed by the replication module can be seen in the resource log and the trace log. For more information about such failure messages from ZFS commands, see Chapter 8, Working With Oracle Solaris ZFS Snapshots and Clones in Managing ZFS File Systems in Oracle Solaris 11.3 and the zfs(1M) man page.

Typically, ZFS command failures occur because the replication user lacks the required ZFS dataset permissions. For information about the required ZFS permissions, see Prerequisites for Configuring Remote Replication Using Oracle Solaris ZFS Snapshot.

For more information about the clresource command, see the clresource(1CL) man page.


Claims (14)

Application Number Priority Date Filing Date Title
US87954107P true 2007-01-10 2007-01-10
US87959807P true 2007-01-10 2007-01-10
US87954307P true 2007-01-10 2007-01-10
US87959307P true 2007-01-10 2007-01-10
PCT/EP2008/000187 WO2008083984A1 ( en ) 2007-01-10 2008-01-09 A navigation device and method for improving a time to identify a location of the navigation device

Author summary

Podoconiosis is one of the major causes of tropical lymphedema and results in massive swelling of the lower limbs. It is caused by exposure to mineral particle-induced inflammation among genetically susceptible individuals. People affected by the disease often suffer both physical and psychological comorbidities, which can include painful swelling, distress, depression, stigma and discrimination. Despite its presence in several African countries, its geographical distribution and burden across Africa remain uncertain. We applied statistical modelling to the most comprehensive database compiled to date to predict the environmental suitability of podoconiosis on the African continent. By combining climate and environmental data (elevation, annual precipitation, land surface temperature, vegetation index, and soil characteristics such as clay and silt fraction) and overlaying population figures, we predicted both the environmental suitability and the human population at risk of podoconiosis in Africa. Environmental suitability for podoconiosis was predicted in 29 African countries. Our estimates provide key evidence that will help decision-makers to better plan more integrated intervention programmes.

Citation: Deribe K, Simpson H, Pullan RL, Bosco MJ, Wanji S, Weaver ND, et al. (2020) Predicting the environmental suitability and population at risk of podoconiosis in Africa. PLoS Negl Trop Dis 14(8): e0008616. https://doi.org/10.1371/journal.pntd.0008616

Editor: Kate Zinszer, Université de Montréal, CANADA

Received: March 2, 2020 Accepted: July 20, 2020 Published: August 27, 2020

Copyright: © 2020 Deribe et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All relevant data are within the manuscript and its Supporting Information files.

Funding: This work was primarily supported by a grant from the Wellcome Trust [grant number 201900/Z/16/Z] to KD as part of his International Intermediate Fellowship. SIH and NDW are funded by the Bill & Melinda Gates Foundation (grant number OPP1132415). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.


Frequently Asked Questions

The following frequently asked questions relate to the Bentley Map product family:

  • Bentley Map **
  • Bentley PowerMap
  • Bentley PowerMap Field
  • Bentley Cadastre
  • Bentley CADscript
  • Bentley PowerDraft for Mapping

I understand that Bentley Map V8i (SELECTseries 2) and later releases are replacements for the above products but may require a new license for proper product activation and use. Is this correct?

** Including all previous Bentley Map 08.09 and 08.11 product releases, up to and including the Bentley Map V8i (SELECTseries 1) 08.11.07.1xx releases.

Yes. Beginning with the Bentley Map V8i (SELECTseries 2) 08.11.07.4xx releases, the different "Product Editions" represent new products that use new product IDs; therefore, to properly activate them, you must contact your sales representative to acquire new licenses.

If you already own licenses of any of the listed products you’ll be able to acquire licenses of the new products by paying the difference in SELECT fees. A simple procedure is in place to help with the transition and your sales representatives have been informed and trained on the transition.


Yorik van Havre

I am an architect by trade, and one of the main FreeCAD developers (I've been around since 2008 or so). In FreeCAD, I'm mostly responsible for implementing BIM-related features, but I also work on many of the other areas and workbenches. BIM stands for Building Information Modeling, and describes a family of tools to model and represent buildings digitally. This goes far beyond plans: a BIM model is a complex and faithful representation of a building, and allows you to extract not only plans and sections, but also a large quantity of information, such as materials, costs, possible maintenance costs over time, construction planning, etc. With a good BIM model, you can also perform a wide range of simulations, such as calculating forces in structures, energy consumption, etc. And finally, maybe the most important point, BIM models should be highly shareable and should integrate nicely all the work of the different people working on a building project, mainly through the use of the IFC file format.

Others in the FreeCAD community and I are working to make FreeCAD a first-class BIM modeling application, capable of doing the same job as commercial BIM software (or better, why not!). FreeCAD is free and open-source, runs on Mac, Windows and Linux, and will stay free forever (it cannot be "closed" or "bought"). Quite a lot is done already; FreeCAD is already a very usable BIM application. But there is much more to do, both to implement new tools and functionality, and to refine the interface to make all this more intuitive and user-friendly.

By sponsoring me here, you are allowing me to spend a bigger part of my working hours on FreeCAD development. At the moment I am spending roughly a quarter of my month on it. I don't want to give up working as an architect, of course; that would even be bad for FreeCAD BIM development, because constant, real-life experience is important. But being able to work half of each month on FreeCAD is my goal. Thanks a million if you are considering helping me reach it, or already are!

The story of what I'm doing with FreeCAD is documented here, and also on my blog. I also post about it on Twitter and other social networks, and regularly make short videos explaining some of the BIM features. If you want to keep track of the latest improvements in FreeCAD (not only mine), check the new features section of the FreeCAD forum.


3 Answers

You could keep track of the number of downloads per file simply by serving the file through a server-side script in the language of your choosing, as opposed to linking directly to the file. Done this way, every time the file is requested you can log the request, and information about who is downloading it, in a database. See this answer for an example of this in PHP.

To learn the geographic location, you will need a GeoIP service to translate the client's IP address into a geolocation. Then simply save that information with your download record.
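As an illustration (in Python rather than PHP; the table name and columns are my own choices), a minimal sketch of the logging side, keeping one row per download so counts, and later a geolocation column, can be derived from it:

```python
import datetime
import sqlite3

# One row per download; a real handler would stream the file bytes back
# after logging, and a GeoIP lookup would fill in a location column.

def init_db(conn):
    conn.execute("""CREATE TABLE IF NOT EXISTS downloads (
        filename TEXT, client_ip TEXT, downloaded_at TEXT)""")

def log_download(conn, filename, client_ip):
    stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
    conn.execute("INSERT INTO downloads VALUES (?, ?, ?)",
                 (filename, client_ip, stamp))
    conn.commit()

def download_count(conn, filename):
    cur = conn.execute("SELECT COUNT(*) FROM downloads WHERE filename = ?",
                       (filename,))
    return cur.fetchone()[0]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    init_db(conn)
    log_download(conn, "newsletter.pdf", "203.0.113.7")
    log_download(conn, "newsletter.pdf", "198.51.100.2")
    print(download_count(conn, "newsletter.pdf"))  # 2
```

Because each row keeps the raw client IP, the geolocation step can be run later, in batch, without touching the download path.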

I took this on with a past employer. After much discussion, we opted AGAINST .pdf as the sole electronic distribution method due to the extra load time, annoying software, and (somewhat) questionable visibility to search engines. I know, I know, Google does index .pdfs, but it seems to crawl traditional HTML sites BETTER.

So, our solution was to deliver the site in newspaper format via a custom CMS that we wrote in-house. We could then use Google Analytics to track inbound, outbound AND search-related traffic, plus paths through the site. A big part of the traffic we received was referrals from current readers, and there was no way to track who is passing around a .pdf via email forwarding. It's easy, however, to track a "send to a friend" link on a site, which is why you see it on CNN, MSNBC, etc.

An added bonus is that by doing it the way we did, we could use queries (or RSS, which we also offered) to cross-post the content back on the main site and the 25 other sites that the company ran. So, a particular letter from a high-ranking CEO could be used to populate multiple newsletters and websites with just a few clicks.

Later, to appease the print-obsessed crowd, we did begin offering a .pdf download, generated on the fly server-side. Sure, it wasn't a perfect, custom-laid-out graphic marvel, but it worked, was automatic, and people liked it. Load time for the .pdf was 20-30 seconds if your .pdf viewer wasn't open; load time on the non-pdf site was about 1 sec by comparison.

Administration-wise, we went from hours per newsletter down to less than an hour. An accompanying email system was set up to auto-generate an email blast with the month's articles with just a handful of clicks. The email blast immediately ratcheted up viewership and doubled return visitors.


Advanced Data Source Management

The worst nightmare for every ArcGIS user is broken data links. ArcMap and ArcCatalog should provide robust, intelligent and automated data source management that continually maintains the integrity of the project’s data structure and actively assists the user in preserving it. This would bring more freedom to the work process and establish a more comfortable working environment.

  • Active and permanent source data surveillance – monitoring of current projects and linked data locations throughout the work process. The user is warned before any change with a direct impact on the project’s integrity (add, delete, move or rename). Approved changes are immediately reflected in the map document (sources are updated automatically).
  • Node-based data source manager – intuitive and interactive management of all externally referenced map data (geo, database, layout) and their relationship visualization (like Autodesk Flame, The Foundry Nuke).
  • Broken data links manager – report layers, databases and layout elements with missing data links on MXD open or LYR add. Provide possibility for automatic or manual repair.
  • Intelligent evaluation of the loss of data connection – identify whole or partial loss due to moved/renamed data sources or base folders (absolute/relative mode) or change of map document location (relative mode) when the project is opened or previewed.
  • Automatic search for moved or renamed layers or folders – finds original or similar sources in the neighborhood of missing data with identification and comparison of best matches according to pre-recorded file, geographic, database and metadata properties.
  • Automatic re-creation of entire broken data structure – recover automatically all missing sources according to pre-recorded information about map document and data sources (path, size, date, name…) or by user specified new data position.
  • Layer data source replacement assistant – to help properly update database dependent layer properties (symbology, definition query, labels and joins/relates) and provide smooth transition during the layer source exchange with emphasis on preserving as many layer settings as possible.

I'd like to see a reverse searching functionality in ArcCatalog that would work something like this:

You right-click a feature class, raster, etc. in ArcCatalog and choose something like 'Search Projects that use this item'. You could enter the file location you want to search through (i.e. all of the C: drive, just C:\projects, etc.). You could also choose which types of files to search through (i.e. Map Documents [.mxd], ArcGIS Explorer documents [.nmf], etc.). Then a dialog box would tell you all of the projects that are referencing this item.

If no projects were found to be referencing the item, then you could feel secure in renaming it, deleting it, etc.

I'm not so sure I want to see ArcGIS bogged down still further (it's already a very heavy-weight, slow-to-start program) with active monitoring and searching for missing data sources. We draw our data sources from local drives and from the network and from ArcSDE servers. The name space is huge, and the number of files in that name space numbers in the millions (that doesn't even consider internet data sources!). Network traffic is already intense without adding additional indexing and searching background processes. And while the reverse lookup ability in the above comment would certainly be nice, I can't fathom how it could reasonably handle cases where some or all of the referencing documents have been moved or archived. Should the data be considered to be de-referenced at that point or not?

ESRI has made progress in allowing data sources to be programmatically updated via Python tools, but what frustrates me is that not all data sources in a document can be addressed this way. Namely, according to the documentation, joined tables cannot have their sources fixed using the tools. It is a very rare map in our shop that doesn't use joined tables, meaning that we still can't use automated tools to fix map documents in a batch when our SDE server moves onto a new host or we replace our file servers. If this last gap could be crossed, I think we could make do.
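For reference, the batch repointing that does work for ordinary layer and standalone-table sources (but, per the documentation, not for sources referenced only through joins) can be sketched with the classic arcpy.mapping API (ArcGIS Desktop 10.x); the folder and workspace paths below are hypothetical:

```python
import glob

import arcpy  # ships with ArcGIS Desktop; not available as a standalone package

OLD_WS = r"\\oldserver\gisdata"  # hypothetical old workspace path
NEW_WS = r"\\newserver\gisdata"  # hypothetical new workspace path

for path in glob.glob(r"C:\projects\*.mxd"):  # hypothetical project folder
    mxd = arcpy.mapping.MapDocument(path)
    # Repoints layers and standalone tables whose workspace matches OLD_WS.
    # Sources referenced only through joins are not updated by this call.
    mxd.findAndReplaceWorkspacePaths(OLD_WS, NEW_WS)
    mxd.save()
    del mxd
```

This is exactly the gap described above: after the loop runs, any join inside a layer still points at the old workspace and has to be rebuilt by hand.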


Recruitment in Cameroon is free of charge. Be wary if you are asked to pay any fees, and never send money by electronic transfer (MOMO or OM) or personal documents such as your CNI - CamerSpace.com

Agency: UNOCHA

Title: National Public Management Officer – UNOCHA BUEA

Job ID: 27669

Practice Area – Job Family: Management – INFORMATION MANAGEMENT

Vacancy End Date: 26/12/2019 (Midnight New York, USA)

Duty Station: Buea, Cameroon

Education & Work Experience: F-2-Year College Degree – 4 year(s) experience, I-Master’s Level Degree – 2 year(s) experience

Languages Required: English, French

Vacancy Type: FTA Local

Posting Type: External

Bureau: Africa

Contract Duration: 1 Year with possibility for extension

The United Nations Office for the Coordination of Humanitarian Affairs (UN OCHA) has established field offices in Buea and Bamenda to facilitate the coordination of humanitarian activities in the North-West and South-West regions of Cameroon.
Information management is a core component of a comprehensive support strategy for the humanitarian community. In order to meet the increased requirements for coordination support, humanitarian advocacy and information, the National Information Management Officer will support the Information Management Unit (IMU) to analyze relevant data (tabular, statistical, spatial etc.) to support an efficient and effective humanitarian response.

Duties and Responsibilities

Summary of key functions:

  1. Support in the development of spatial/geographical information products
  2. Support the development and maintenance of comprehensive operational information products
  3. Maintain a client-oriented approach that ensures that OCHA provides high-quality information management services and products to the OCHA office and to members of the humanitarian community:
  4. Facilitate knowledge building and knowledge sharing within OCHA and guidance to external stakeholders on information management focusing on achievement of the following:

Under the overall guidance of the Head of OCHA Office, the direct supervision of the Head of OCHA Sub office in Buea, and the technical supervision of the Head Information Management Unit, the national IMO will be responsible for the following duties:

Support in the development of spatial/geographical information products (i.e. maps, metadata, data dictionary, etc):

  • Collect, organize and file geographic data; support map/Geographic Information Systems (GIS) production and geographic data management. This requires strong practical knowledge of relational database software such as MS Access, as well as MS Excel, including the pivot table function. Experience with GIS tools such as ArcGIS, MapInfo, QGIS, etc. is also required.
  • Develop and maintain spatial baseline and operational datasets in accordance with relevant standards and guidance, including IASC Common Operational Datasets (CODs).

Support the development and maintenance of comprehensive operational information products, Who/What/Where, monitoring matrices, operational analyses, contact lists among others:

  • Build strong relationships and maintain regular contacts with the local and international community to gather information on humanitarian activities in support of the Who/What/Where database, including frequent liaison with key stakeholders.
  • Support the development of standardized reporting formats and analysis to support operational decision making for internal and external use.
  • Collect information and assist in analysis of monitoring reports based on humanitarian indicators to provide a coherent picture of humanitarian operations.
  • On an ad-hoc basis, collect, analyze and disseminate information in cooperation with other Units within OCHA.

Maintain a client-oriented approach that ensures that OCHA provides high-quality information management services and products to the OCHA office and to members of the humanitarian community:

  • Provide liaison support with relevant partners and stakeholders to promote information sharing and coordination.
  • Provide support to the OCHA field offices, organize flow of information and assist the offices in planning information management activities.
  • Provide graphics/design support for various presentations, as well as the development of high-quality visual products (infographics, maps, tables, graphs).
  • Conduct regular trainings for sector members and work closely with the IM counterparts in partner agencies and organizations throughout the IMWG.

Facilitate knowledge building and knowledge sharing within OCHA and guidance to external stakeholders on information management focusing on achievement of the following: