python, date, out-of-memory, osmnx

osmnx out of memory when run with ox.settings.overpass_settings=f"[out:json][timeout:{ox.settings.requests_timeout}][date:\"{date}\"]"


I found that setting

ox.settings.overpass_settings = f"[out:json][timeout:{ox.settings.requests_timeout}][date:\"{date}\"]"

easily triggers an out-of-memory error from the Overpass API. Here is a snippet of my code.

import osmnx as ox
import shapely.wkt
import datetime

# snapshot date to pass to the Overpass [date:...] setting
date = datetime.datetime.fromisoformat("2023-12-31T10:15:23.355030")

# study area as a WKT polygon
poli = "POLYGON ((12.492709372340677 41.916655635027965, 12.495766040251999 41.99143760629819, 12.76378053852936 41.984419025131984, 12.754692779733519 41.78305304410847, 12.487561402159495 41.79004473805951, 12.48453188956227 41.715248372182295, 12.21767031884794 41.72151888179064, 12.224974122071886 41.92295004575664, 12.492709372340677 41.916655635027965))"

ox.settings.requests_timeout = 200
ox.settings.use_cache = False
ox.settings.log_console = True
ox.settings.overpass_settings = f"[out:json][timeout:{ox.settings.requests_timeout}][date:\"{date}\"]"

geom = shapely.wkt.loads(poli)
tags = {"highway": "residential"}
features = ox.features_from_polygon(geom, tags)
print(len(features))

If the ox.settings.overpass_settings line is commented out, the code runs smoothly.

Is there an error in my code? Please help.


Solution

  • The Overpass server is running out of memory trying to execute your query. Use the ox.settings.overpass_memory setting (see the docs) to allocate more server memory. Example:

    import osmnx as ox
    ox.settings.use_cache = False
    ox.settings.log_console = True
    place = "Piedmont, California, USA"
    tags = {"highway": "residential"}
    
    # this works fine
    features = ox.features.features_from_place(place, tags)
    
    # this works fine too
    ox.settings.overpass_settings = '[out:json][timeout:{timeout}][date:"2019-10-28T19:20:00Z"]{maxsize}'
    G = ox.graph.graph_from_place(place, network_type="drive")
    
    # but if you uncomment the code line below, it causes an error
    # server remark "runtime error: Query run out of memory using about 2048 MB of RAM"
    # features = ox.features.features_from_place(place, tags)
    
    # but if you increase the server memory, it works but takes a pretty long time
    ox.settings.overpass_memory = 3_000_000_000  # value is in bytes (~3 GB)
    features = ox.features.features_from_place(place, tags)
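
    One thing to note about the snippet in the question: that custom overpass_settings string has no {maxsize} placeholder. As far as I can tell, OSMnx injects overpass_memory into {maxsize} (and requests_timeout into {timeout}) when it builds the query, so a custom settings string should keep both placeholders, otherwise the extra memory is never requested. A minimal sketch (the 3 GB figure is just an example value):

    # keep {timeout} and {maxsize} literal so OSMnx can fill them in
    ox.settings.overpass_memory = 3_000_000_000  # in bytes
    ox.settings.overpass_settings = '[out:json][timeout:{timeout}][date:"2019-10-28T19:20:00Z"]{maxsize}'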
    

    I thought this may have been an OSMnx bug at first, but apparently not, per this comment on GitHub:

    Adding [date] to a query results in a much different evaluation strategy, because the data needs to be reconstructed from previous and current object versions to match the requested point in time. Without [date], only the current object versions are considered, which cuts down the data volume significantly.

    and

    By the way, I've checked the query on another implementation with some performance improvements in place. Both date queries took a second or two with 512M maxsize.

    So, it seems that the Overpass API back-end's query evaluation strategy is currently pretty inefficient when given a [date] parameter, causing it to require significant memory. Other implementations that are performance-optimized may not require so much memory.
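
    For completeness, here is how I would adapt the snippet from the question. This is just a sketch assuming recent OSMnx setting names (requests_timeout / overpass_memory); the 3 GB allocation and the second-precision UTC date string are my own choices, and you may need to tune them:

    import datetime

    import osmnx as ox
    import shapely.wkt

    # format the snapshot date the way the example above does: ISO 8601, UTC, trailing Z
    date = datetime.datetime.fromisoformat("2023-12-31T10:15:23.355030")
    overpass_date = date.strftime("%Y-%m-%dT%H:%M:%SZ")

    ox.settings.requests_timeout = 200
    ox.settings.use_cache = False
    ox.settings.log_console = True
    ox.settings.overpass_memory = 3_000_000_000  # ask the server for ~3 GB (value in bytes)
    # the doubled braces keep {timeout} and {maxsize} as literal placeholders for OSMnx to fill
    ox.settings.overpass_settings = f'[out:json][timeout:{{timeout}}][date:"{overpass_date}"]{{maxsize}}'

    poli = "POLYGON ((12.492709372340677 41.916655635027965, 12.495766040251999 41.99143760629819, 12.76378053852936 41.984419025131984, 12.754692779733519 41.78305304410847, 12.487561402159495 41.79004473805951, 12.48453188956227 41.715248372182295, 12.21767031884794 41.72151888179064, 12.224974122071886 41.92295004575664, 12.492709372340677 41.916655635027965))"
    geom = shapely.wkt.loads(poli)
    features = ox.features_from_polygon(geom, tags={"highway": "residential"})
    print(len(features))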