OWSLib 0.8.10 documentation

OWSLib
Author: Tom Kralidis
Contact: tomkralidis at gmail.com
Release: 0.8.10
Date: 2014-10-13

Introduction

OWSLib is a Python package for client programming with Open Geospatial Consortium (OGC) web service (hence OWS) interface standards, and their related content models.

OWSLib was originally part of PCL (the Python Cartographic Library), but was split out as a separate project in r481.

Features

Standards Support

Standard Version(s)
OGC WMS 1.1.1
OGC WFS 1.0.0, 1.1.0, 2.0.0
OGC WCS 1.0.0, 1.1.0
OGC WMC 1.1.0
OGC SOS 1.0.0, 2.0.0
OGC SensorML 1.0.1
OGC CSW 2.0.2
OGC WPS 1.0.0
OGC Filter 1.1.0
OGC OWS Common 1.0.0, 1.1.0, 2.0
NASA DIF 9.7
FGDC CSDGM 1998
ISO 19139 2007
Dublin Core 1.1
WMTS 1.0.0
WaterML 1.0, 1.1

Installation

Requirements

OWSLib requires a Python interpreter, as well as ElementTree or lxml for XML parsing.
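
The lxml-or-ElementTree requirement is typically handled with a guarded import. The following is a minimal sketch of that pattern (OWSLib wraps a similar fallback in its own owslib.etree module):

```python
# Prefer lxml's etree when it is installed; otherwise fall back to the
# standard library parser. Mirrors the requirement stated above.
try:
    from lxml import etree
except ImportError:
    import xml.etree.ElementTree as etree

doc = etree.fromstring('<service><title>demo</title></service>')
print(doc.findtext('title'))  # -> demo
```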

Install

PyPI:

$ easy_install OWSLib
# pip works too
$ pip install OWSLib

Git:

$ git clone git://github.com/geopython/OWSLib.git

openSUSE:

$ zypper ar http://download.opensuse.org/repositories/Application:/Geo/openSUSE_12.1/ GEO
$ zypper refresh
$ zypper install owslib

CentOS:

$ wget -O /etc/yum.repos.d/CentOS-CBS.repo http://download.opensuse.org/repositories/Application:/Geo/CentOS_6/Application:Geo.repo
$ yum install owslib

Red Hat Enterprise Linux:

$ wget -O /etc/yum.repos.d/RHEL-CBS.repo http://download.opensuse.org/repositories/Application:/Geo/RHEL_6/Application:Geo.repo
$ yum install owslib

Fedora:

Note

As of Fedora 20, OWSLib is part of the Fedora core package collection

Usage

WMS

Find out what a WMS has to offer. Service metadata:

>>> from owslib.wms import WebMapService
>>> wms = WebMapService('http://wms.jpl.nasa.gov/wms.cgi', version='1.1.1')
>>> wms.identification.type
'OGC:WMS'
>>> wms.identification.title
'JPL Global Imagery Service'

Available layers:

>>> list(wms.contents)
['global_mosaic', 'global_mosaic_base', 'us_landsat_wgs84', 'srtm_mag', 'daily_terra_721', 'daily_aqua_721', 'daily_terra_ndvi', 'daily_aqua_ndvi', 'daily_terra', 'daily_aqua', 'BMNG', 'modis', 'huemapped_srtm', 'srtmplus', 'worldwind_dem', 'us_ned', 'us_elevation', 'us_colordem']

Details of a layer:

>>> wms['global_mosaic'].title
'WMS Global Mosaic, pan sharpened'
>>> wms['global_mosaic'].queryable
0
>>> wms['global_mosaic'].opaque
0
>>> wms['global_mosaic'].boundingBox
>>> wms['global_mosaic'].boundingBoxWGS84
(-180.0, -60.0, 180.0, 84.0)
>>> wms['global_mosaic'].crsOptions
['EPSG:4326', 'AUTO:42003']
>>> wms['global_mosaic'].styles
{'pseudo_bright': {'title': 'Pseudo-color image (Uses IR and Visual bands, 542 mapping), gamma 1.5'}, 'pseudo': {'title': '(default) Pseudo-color image, pan sharpened (Uses IR and Visual bands, 542 mapping), gamma 1.5'}, 'visual': {'title': 'Real-color image, pan sharpened (Uses the visual bands, 321 mapping), gamma 1.5'}, 'pseudo_low': {'title': 'Pseudo-color image, pan sharpened (Uses IR and Visual bands, 542 mapping)'}, 'visual_low': {'title': 'Real-color image, pan sharpened (Uses the visual bands, 321 mapping)'}, 'visual_bright': {'title': 'Real-color image (Uses the visual bands, 321 mapping), gamma 1.5'}}

Available methods, their URLs, and available formats:

>>> [op.name for op in wms.operations]
['GetCapabilities', 'GetMap']
>>> wms.getOperationByName('GetMap').methods
{'Get': {'url': 'http://wms.jpl.nasa.gov/wms.cgi?'}}
>>> wms.getOperationByName('GetMap').formatOptions
['image/jpeg', 'image/png', 'image/geotiff', 'image/tiff']

That’s everything needed to make a request for imagery:

>>> img = wms.getmap(   layers=['global_mosaic'],
...                     styles=['visual_bright'],
...                     srs='EPSG:4326',
...                     bbox=(-112, 36, -106, 41),
...                     size=(300, 250),
...                     format='image/jpeg',
...                     transparent=True
...                     )
>>> out = open('jpl_mosaic_visb.jpg', 'wb')
>>> out.write(img.read())
>>> out.close()

Result:

WMS GetMap generated by OWSLib

WFS
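
This section is not yet written. As a hedged illustration, the sketch below builds the kind of GetFeature KVP request that a WFS client such as owslib.wfs issues under the hood; the endpoint and type name are placeholders, not a live service:

```python
# Hypothetical sketch: construct a WFS GetFeature request URL by hand.
# owslib.wfs.WebFeatureService performs this KVP encoding internally.
from urllib.parse import urlencode

def getfeature_url(base_url, typename, version='1.1.0', maxfeatures=10):
    params = {
        'service': 'WFS',
        'version': version,
        'request': 'GetFeature',
        'typename': typename,
        'maxfeatures': maxfeatures,
    }
    return base_url + '?' + urlencode(params)

url = getfeature_url('http://example.org/wfs', 'topp:states')
print(url)
```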

WCS

CSW

Connect to a CSW, and inspect its properties:

>>> from owslib.csw import CatalogueServiceWeb
>>> csw = CatalogueServiceWeb('http://geodiscover.cgdi.ca/wes/serviceManagerCSW/csw')
>>> csw.identification.type
'CSW'
>>> [op.name for op in csw.operations]
['GetCapabilities', 'GetRecords', 'GetRecordById', 'DescribeRecord', 'GetDomain']

Get supported resultType values:

>>> csw.getdomain('GetRecords.resultType')
>>> csw.results
{'values': ['results', 'validate', 'hits'], 'parameter': 'GetRecords.resultType', 'type': 'csw:DomainValuesType'}
>>>

Search for bird data:

>>> from owslib.fes import PropertyIsEqualTo, PropertyIsLike, BBox
>>> birds_query = PropertyIsEqualTo('csw:AnyText', 'birds')
>>> csw.getrecords(constraints=[birds_query], maxrecords=20)
>>> csw.results
{'matches': 101, 'nextrecord': 21, 'returned': 20}
>>> for rec in csw.records:
...     print csw.records[rec].title
...
ALLSPECIES
NatureServe Canada References
Bird Studies Canada - BirdMap WMS
Parks Canada Geomatics Metadata Repository
Bird Studies Canada - BirdMap WFS
eBird Canada - Survey Locations
WHC CitizenScience WMS
Project FeederWatch - Survey Locations
North American Bird Banding and Encounter Database
Wildlife Habitat Canada CitizenScience WFS
Parks Canada Geomatics Metadata Repository
Parks Canada Geomatics Metadata Repository
Wildlife Habitat Canada CitizenScience WMS
Canadian IBA Polygon layer
Land
Wildlife Habitat Canada CitizenScience WMS
WATER
Parks Canada Geomatics Metadata Repository
Breeding Bird Survey
SCALE
>>>
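
The 'matches'/'nextrecord' bookkeeping shown above is what drives result paging. A minimal offline sketch of a paging loop (fetch_page is a stand-in for a getrecords call that accepts a start position; it is not an OWSLib API):

```python
# Page through a CSW-style result set using matches/nextrecord
# bookkeeping; a nextrecord of 0 signals the last page.
def page_through(fetch_page, page_size=20):
    collected, start = [], 1
    while True:
        results, records = fetch_page(start, page_size)
        collected.extend(records)
        if results['nextrecord'] == 0:
            break
        start = results['nextrecord']
    return collected

# Fake backend with 45 records, to exercise the loop offline.
def fake_fetch(start, size, total=45):
    end = min(start + size - 1, total)
    nextrec = end + 1 if end < total else 0
    return {'matches': total, 'nextrecord': nextrec}, list(range(start, end + 1))

print(len(page_through(fake_fetch)))  # -> 45
```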

Search for bird data in Canada:

>>> bbox_query = BBox([-141,42,-52,84])
>>> csw.getrecords(constraints=[birds_query, bbox_query])
>>> csw.results
{'matches': 3, 'nextrecord': 0, 'returned': 3}
>>>

Search for keywords like ‘birds’ or ‘fowl’:

>>> birds_query_like = PropertyIsLike('dc:subject', '%birds%')
>>> fowl_query_like = PropertyIsLike('dc:subject', '%fowl%')
>>> csw.getrecords(constraints=[birds_query_like, fowl_query_like])
>>> csw.results
{'matches': 107, 'nextrecord': 11, 'returned': 10}
>>>

Search for a specific record:

>>> csw.getrecordbyid(id=['9250AA67-F3AC-6C12-0CB9-0662231AA181'])
>>> csw.records['9250AA67-F3AC-6C12-0CB9-0662231AA181'].title
'ALLSPECIES'

Search with a CQL query:

>>> csw.getrecords(cql='csw:AnyText like "%birds%"')

Transaction: insert

>>> csw.transaction(ttype='insert', typename='gmd:MD_Metadata', record=open('file.xml').read())

Transaction: update

>>> # update ALL records
>>> csw.transaction(ttype='update', typename='csw:Record', propertyname='dc:title', propertyvalue='New Title')
>>> # update records satisfying keywords filter
>>> csw.transaction(ttype='update', typename='csw:Record', propertyname='dc:title', propertyvalue='New Title', keywords=['birds','fowl'])
>>> # update records satisfying BBOX filter
>>> csw.transaction(ttype='update', typename='csw:Record', propertyname='dc:title', propertyvalue='New Title', bbox=[-141,42,-52,84])

Transaction: delete

>>> # delete ALL records
>>> csw.transaction(ttype='delete', typename='gmd:MD_Metadata')
>>> # delete records satisfying keywords filter
>>> csw.transaction(ttype='delete', typename='gmd:MD_Metadata', keywords=['birds','fowl'])
>>> # delete records satisfying BBOX filter
>>> csw.transaction(ttype='delete', typename='gmd:MD_Metadata', bbox=[-141,42,-52,84])

Harvest a resource:

>>> csw.harvest('http://host/url.xml', 'http://www.isotc211.org/2005/gmd')

WMC

WPS

Inspect a remote WPS and retrieve the supported processes:

>>> from owslib.wps import WebProcessingService
>>> wps = WebProcessingService('http://cida.usgs.gov/climate/gdp/process/WebProcessingService', verbose=False, skip_caps=True)
>>> wps.getcapabilities()
>>> wps.identification.type
'WPS'
>>> wps.identification.title
'Geo Data Portal WPS Processing'
>>> wps.identification.abstract
'Geo Data Portal WPS Processing'
>>> for operation in wps.operations:
...     operation.name
...
'GetCapabilities'
'DescribeProcess'
'Execute'
>>> for process in wps.processes:
...     process.identifier, process.title
...
('gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageIntersectionAlgorithm', 'Feature Coverage WCS Intersection')
('gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageOPeNDAPIntersectionAlgorithm', 'Feature Coverage OPeNDAP Intersection')
('gov.usgs.cida.gdp.wps.algorithm.FeatureCategoricalGridCoverageAlgorithm', 'Feature Categorical Grid Coverage')
('gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm', 'Feature Weighted Grid Statistics')
('gov.usgs.cida.gdp.wps.algorithm.FeatureGridStatisticsAlgorithm', 'Feature Grid Statistics')
('gov.usgs.cida.gdp.wps.algorithm.PRMSParameterGeneratorAlgorithm', 'PRMS Parameter Generator')
>>>

Determine how a specific process needs to be invoked, i.e. its input parameters and output results:

>>> from owslib.wps import printInputOutput
>>> process = wps.describeprocess('gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm')
>>> process.identifier
'gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm'
>>> process.title
'Feature Weighted Grid Statistics'
>>> process.abstract
'This algorithm generates area weighted statistics of a gridded dataset for a set of vector polygon features. Using the bounding-box that encloses ...
>>> for input in process.dataInputs:
...     printInputOutput(input)
...
 identifier=FEATURE_COLLECTION, title=Feature Collection, abstract=A feature collection encoded as a WFS request or one of the supported GML profiles.,...
 Supported Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/2.0.0/feature.xsd
 Supported Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/2.1.1/feature.xsd
 Supported Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/2.1.2/feature.xsd
 Supported Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/2.1.2.1/feature.xsd
 Supported Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/3.0.0/base/feature.xsd
 Supported Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/3.0.1/base/feature.xsd
 Supported Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/3.1.0/base/feature.xsd
 Supported Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/3.1.1/base/feature.xsd
 Supported Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/3.2.1/base/feature.xsd
 Default Value: mimeType=text/xml, encoding=UTF-8, schema=http://schemas.opengis.net/gml/2.0.0/feature.xsd
 minOccurs=1, maxOccurs=1
 identifier=DATASET_URI, title=Dataset URI, abstract=The base data web service URI for the dataset of interest., data type=anyURI
 Allowed Value: AnyValue
 Default Value: None
 minOccurs=1, maxOccurs=1
 identifier=DATASET_ID, title=Dataset Identifier, abstract=The unique identifier for the data type or variable of interest., data type=string
 Allowed Value: AnyValue
 Default Value: None
 minOccurs=1, maxOccurs=2147483647
 identifier=REQUIRE_FULL_COVERAGE, title=Require Full Coverage, abstract=If turned on, the service will require that the dataset of interest ....
 Allowed Value: True
 Default Value: True
 minOccurs=1, maxOccurs=1
 identifier=TIME_START, title=Time Start, abstract=The date to begin analysis., data type=dateTime
 Allowed Value: AnyValue
 Default Value: None
 minOccurs=0, maxOccurs=1
 identifier=TIME_END, title=Time End, abstract=The date to end analysis., data type=dateTime
 Allowed Value: AnyValue
 Default Value: None
 minOccurs=0, maxOccurs=1
 identifier=FEATURE_ATTRIBUTE_NAME, title=Feature Attribute Name, abstract=The attribute that will be used to label column headers in processing output., ...
 Allowed Value: AnyValue
 Default Value: None
 minOccurs=1, maxOccurs=1
 identifier=DELIMITER, title=Delimiter, abstract=The delimiter that will be used to separate columns in the processing output., data type=string
 Allowed Value: COMMA
 Allowed Value: TAB
 Allowed Value: SPACE
 Default Value: COMMA
 minOccurs=1, maxOccurs=1
 identifier=STATISTICS, title=Statistics, abstract=Statistics that will be returned for each feature in the processing output., data type=string
 Allowed Value: MEAN
 Allowed Value: MINIMUM
 Allowed Value: MAXIMUM
 Allowed Value: VARIANCE
 Allowed Value: STD_DEV
 Allowed Value: SUM
 Allowed Value: COUNT
 Default Value: None
 minOccurs=1, maxOccurs=7
 identifier=GROUP_BY, title=Group By, abstract=If multiple features and statistics are selected, this will change whether the processing output ...
 Allowed Value: STATISTIC
 Allowed Value: FEATURE_ATTRIBUTE
 Default Value: None
 minOccurs=1, maxOccurs=1
 identifier=SUMMARIZE_TIMESTEP, title=Summarize Timestep, abstract=If selected, processing output will include columns with summarized statistics ...
 Allowed Value: True
 Default Value: True
 minOccurs=0, maxOccurs=1
 identifier=SUMMARIZE_FEATURE_ATTRIBUTE, title=Summarize Feature Attribute, abstract=If selected, processing output will include a final row of ...
 Allowed Value: True
 Default Value: True
 minOccurs=0, maxOccurs=1
>>> for output in process.processOutputs:
...     printInputOutput(output)
...
 identifier=OUTPUT, title=Output File, abstract=A delimited text file containing requested process output., data type=ComplexData
 Supported Value: mimeType=text/csv, encoding=UTF-8, schema=None
 Default Value: mimeType=text/csv, encoding=UTF-8, schema=None
 reference=None, mimeType=None
>>>

Submit a processing request (extraction of a climate index variable over a specific GML polygon, for a given period of time), monitor the execution until complete:

>>> from owslib.wps import GMLMultiPolygonFeatureCollection
>>> polygon = [(-102.8184, 39.5273), (-102.8184, 37.418), (-101.2363, 37.418), (-101.2363, 39.5273), (-102.8184, 39.5273)]
>>> featureCollection = GMLMultiPolygonFeatureCollection( [polygon] )
>>> processid = 'gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm'
>>> inputs = [ ("FEATURE_ATTRIBUTE_NAME","the_geom"),
...            ("DATASET_URI", "dods://cida.usgs.gov/qa/thredds/dodsC/derivatives/derivative-days_above_threshold.pr.ncml"),
...            ("DATASET_ID", "ensemble_b1_pr-days_above_threshold"),
...            ("TIME_START","2010-01-01T00:00:00.000Z"),
...            ("TIME_END","2011-01-01T00:00:00.000Z"),
...            ("REQUIRE_FULL_COVERAGE","false"),
...            ("DELIMITER","COMMA"),
...            ("STATISTICS","MEAN"),
...            ("GROUP_BY","STATISTIC"),
...            ("SUMMARIZE_TIMESTEP","false"),
...            ("SUMMARIZE_FEATURE_ATTRIBUTE","false"),
...            ("FEATURE_COLLECTION", featureCollection)
...           ]
>>> output = "OUTPUT"
>>> execution = wps.execute(processid, inputs, output = "OUTPUT")
Executing WPS request...
Execution status=ProcessStarted
>>> from owslib.wps import monitorExecution
>>> monitorExecution(execution)

Checking execution status... (location=http://cida.usgs.gov/climate/gdp/process/RetrieveResultServlet?id=6809217153012787208)
Execution status=ProcessSucceeded
Execution status: ProcessSucceeded
Output URL=http://cida.usgs.gov/climate/gdp/process/RetrieveResultServlet?id=6809217153012787208OUTPUT.3cbcd666-a912-456f-84a3-6ede450aca95
>>>

Alternatively, define the feature through an embedded query to a WFS server:

>>> from owslib.wps import WFSQuery, WFSFeatureCollection
>>> wfsUrl = "http://cida.usgs.gov/climate/gdp/proxy/http://igsarm-cida-gdp2.er.usgs.gov:8082/geoserver/wfs"
>>> query = WFSQuery("sample:CONUS_States", propertyNames=['the_geom',"STATE"], filters=["CONUS_States.508","CONUS_States.469"])
>>> featureCollection = WFSFeatureCollection(wfsUrl, query)
>>> # same process submission as above
...

You can also submit a pre-made request encoded as WPS XML:

>>> request = open('/Users/cinquini/Documents/workspace-cog/wps/tests/resources/wps_USGSExecuteRequest1.xml','r').read()
>>> execution = wps.execute(None, [], request=request)
Executing WPS request...
Execution status=ProcessStarted
>>> monitorExecution(execution)

Checking execution status... (location=http://cida.usgs.gov/climate/gdp/process/RetrieveResultServlet?id=5103866488472745994)
Execution status=ProcessSucceeded
Execution status: ProcessSucceeded
Output URL=http://cida.usgs.gov/climate/gdp/process/RetrieveResultServlet?id=5103866488472745994OUTPUT.f80e2a78-96a9-4343-9777-be60fac5b256

SOS

GetCapabilities

Imports

>>> from tests.utils import cast_tuple_int_list, resource_file
>>> from owslib.sos import SensorObservationService
>>> from owslib.fes import FilterCapabilities
>>> from owslib.ows import OperationsMetadata
>>> from owslib.crs import Crs
>>> from datetime import datetime
>>> from operator import itemgetter

Initialize ncSOS

>>> xml = open(resource_file('sos_ncSOS_getcapabilities.xml'), 'r').read()
>>> ncsos = SensorObservationService(None, xml=xml)

Initialize 52N

>>> xml = open(resource_file('sos_52n_getcapabilities.xml'), 'r').read()
>>> f2n = SensorObservationService(None, xml=xml)

Initialize NDBC

>>> xml = open(resource_file('sos_ndbc_getcapabilities.xml'), 'r').read()
>>> ndbc = SensorObservationService(None, xml=xml)

ServiceIdentification

>>> id = ndbc.identification
>>> id.service
'OGC:SOS'
>>> id.version
'1.0.0'
>>> id.title
'National Data Buoy Center SOS'
>>> id.abstract
'National Data Buoy Center SOS'
>>> id.keywords
['Weather', 'Ocean Currents', 'Air Temperature', 'Water Temperature', 'Conductivity', 'Salinity', 'Barometric Pressure', 'Water Level', 'Waves', 'Winds', 'NDBC']
>>> id.fees
'NONE'
>>> id.accessconstraints
'NONE'

ServiceProvider

>>> p = ndbc.provider
>>> p.name
'National Data Buoy Center'
>>> p.url
'http://sdf.ndbc.noaa.gov/'

ServiceContact

>>> sc = p.contact

Unused fields should return nothing

>>> sc.role
>>> sc.position
>>> sc.instructions
>>> sc.organization
>>> sc.fax
>>> sc.hours
>>> sc.name
'Webmaster'
>>> sc.phone
'228-688-2805'
>>> sc.address
'Bldg. 3205'
>>> sc.city
'Stennis Space Center'
>>> sc.region
'MS'
>>> sc.postcode
'39529'
>>> sc.country
'USA'
>>> sc.email
'webmaster.ndbc@noaa.gov'

OperationsMetadata

>>> o = ndbc.operations
>>> len(o)
3


Get by name

>>> getcap = ndbc.get_operation_by_name('GetCapabilities')
>>> isinstance(getcap, OperationsMetadata)
True

Get by name (case insensitive)

>>> getcap = ndbc.get_operation_by_name('getcapabilities')
>>> isinstance(getcap, OperationsMetadata)
True

# GetCapabilities

>>> getcap.constraints
[]
>>> x = getcap.parameters
>>> x == {'Sections': {'values': ['ServiceIdentification', 'ServiceProvider', 'OperationsMetadata', 'Contents', 'All']}}
True
>>> x = sorted(getcap.methods, key=itemgetter('type'))
>>> x == [{'type': 'Get', 'url': 'http://sdf.ndbc.noaa.gov/sos/server.php', 'constraints': []}, {'type' : 'Post', 'url': 'http://sdf.ndbc.noaa.gov/sos/server.php', 'constraints': []}]
True

# DescribeSensor

>>> descsen = ndbc.get_operation_by_name('describesensor')
>>> descsen.constraints
[]
>>> x = descsen.parameters
>>> x == {'outputFormat': {'values': ['text/xml;subtype="sensorML/1.0.1"']}}
True
>>> x = sorted(descsen.methods, key=itemgetter('type'))
>>> x == [{'type': 'Get', 'url': 'http://sdf.ndbc.noaa.gov/sos/server.php', 'constraints': []}, {'type' : 'Post', 'url': 'http://sdf.ndbc.noaa.gov/sos/server.php', 'constraints': []}]
True

# GetObservation

>>> getob = ndbc.get_operation_by_name('getobservation')
>>> getob.constraints
[]
>>> x = getob.parameters
>>> x == {'observedProperty': {'values': ['air_temperature', 'air_pressure_at_sea_level', 'sea_water_electrical_conductivity', 'currents', 'sea_water_salinity', 'sea_floor_depth_below_sea_surface', 'sea_water_temperature', 'waves', 'winds']}}
True
>>> x = sorted(getob.methods, key=itemgetter('type'))
>>> x == [{'type' : 'Get', 'url': 'http://sdf.ndbc.noaa.gov/sos/server.php', 'constraints': []}, {'type' : 'Post', 'url': 'http://sdf.ndbc.noaa.gov/sos/server.php', 'constraints': []}]
True

Filter_Capabilities

>>> filter = ndbc.filters
>>> isinstance(filter, FilterCapabilities)
False

Contents

>>> contents = ndbc.contents
>>> len(contents)
848

Network

>>> network = contents['network-all']
>>> network.id
'network-all'
>>> network.name
'urn:ioos:network:noaa.nws.ndbc:all'
>>> network.description
'All stations on the NDBC SOS server'
>>> srs = network.srs
>>> isinstance(srs, Crs)
True
>>> srs.getcodeurn()
'urn:ogc:def:crs:EPSG::4326'
>>> srs.getcode()
'EPSG:4326'

# (left, bottom, right, top)

>>> cast_tuple_int_list(network.bbox)
[-179, -77, 180, 80]
>>> bbsrs = network.bbox_srs
>>> isinstance(bbsrs, Crs)
True
>>> bbsrs.getcodeurn()
'urn:ogc:def:crs:EPSG::4326'
>>> bbsrs.getcode()
'EPSG:4326'
>>> bp = network.begin_position
>>> isinstance(bp, datetime)
True
>>> ep = network.end_position
>>> isinstance(ep, datetime)
True
>>> network.result_model
'om:Observation'
>>> procs = network.procedures
>>> len(procs)
847
>>> network.observed_properties
['http://mmisw.org/ont/cf/parameter/air_temperature', 'http://mmisw.org/ont/cf/parameter/air_pressure_at_sea_level', 'http://mmisw.org/ont/cf/parameter/sea_water_electrical_conductivity', 'http://mmisw.org/ont/cf/parameter/currents', 'http://mmisw.org/ont/cf/parameter/sea_water_salinity', 'http://mmisw.org/ont/cf/parameter/sea_floor_depth_below_sea_surface', 'http://mmisw.org/ont/cf/parameter/sea_water_temperature', 'http://mmisw.org/ont/cf/parameter/waves', 'http://mmisw.org/ont/cf/parameter/winds']
>>> foi = network.features_of_interest
>>> len(foi)
1082
>>> rfs = network.response_formats
>>> len(rfs)
5
>>> rfs[-1]
'application/vnd.google-earth.kml+xml'
>>> rms = network.response_modes
>>> len(rms)
1
>>> rms[0]
'inline'

Station

>>> station = contents['station-zbqn7']
>>> station.id
'station-zbqn7'
>>> station.name
'urn:ioos:station:wmo:zbqn7'
>>> station.description
"Zeke's Basin, North Carolina"
>>> srs = station.srs
>>> isinstance(srs, Crs)
True
>>> srs.getcodeurn()
'urn:ogc:def:crs:EPSG::4326'
>>> srs.getcode()
'EPSG:4326'
>>> cast_tuple_int_list(station.bbox)
[-77, 33, -77, 33]
>>> bbsrs = station.bbox_srs
>>> isinstance(bbsrs, Crs)
True
>>> bbsrs.getcodeurn()
'urn:ogc:def:crs:EPSG::4326'
>>> bbsrs.getcode()
'EPSG:4326'
>>> bp = station.begin_position
>>> isinstance(bp, datetime)
True
>>> ep = station.end_position
>>> isinstance(ep, datetime)
True
>>> station.result_model
'om:Observation'
>>> procs = station.procedures
>>> len(procs)
1
>>> procs[0]
'urn:ioos:station:wmo:zbqn7'
>>> ops = station.observed_properties
>>> len(ops)
3
>>> ops[0]
'http://mmisw.org/ont/cf/parameter/sea_water_electrical_conductivity'
>>> foi = station.features_of_interest
>>> len(foi)
1
>>> foi[0]
'urn:cgi:Feature:CGI:EarthOcean'
>>> rfs = station.response_formats
>>> len(rfs)
5
>>> rfs[0]
'text/xml;schema="ioos/0.6.1"'
>>> rm = station.response_modes
>>> len(rm)
1
>>> rm[0]
'inline'

GetObservation

Imports

>>> from tests.utils import resource_file
>>> from owslib.sos import SensorObservationService
>>> from owslib.ows import OperationsMetadata
>>> from owslib.fes import FilterCapabilities
>>> from owslib.crs import Crs
>>> from datetime import datetime

Initialize

>>> xml = open(resource_file('sos_ndbc_getcapabilities.xml'),'r').read()
>>> ndbc = SensorObservationService(None, xml=xml)

GetObservation

# Send a funky eventTime

>>> off = ndbc.offerings[1]
>>> offerings = [off.name]
>>> responseFormat = off.response_formats[0]
>>> observedProperties = [off.observed_properties[0]]
>>> #observedProperties = [ndbc.get_operation_by_name('GetObservation').parameters['observedProperty']['values'][0]]
>>> eventTime = "This is not a valid eventTime!"
>>> response = ndbc.get_observation(offerings=offerings, responseFormat=responseFormat, observedProperties=observedProperties, eventTime=eventTime)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ExceptionReport: 'This is not a valid eventTime!'

# NDBC only supports one offering and one observedProperty at a time

>>> off = ndbc.offerings[1]
>>> offerings = [off.name]
>>> responseFormat = off.response_formats[0]
>>> observedProperties = [off.observed_properties[0]]
>>> #observedProperties = [ndbc.get_operation_by_name('GetObservation').parameters['observedProperty']['values'][0]]
>>> eventTime = None
>>> response = ndbc.get_observation(offerings=offerings, responseFormat=responseFormat, observedProperties=observedProperties, eventTime=eventTime)

DescribeSensor

# Send a funky procedure

>>> procedure = "foobar"
>>> outputFormat = ndbc.get_operation_by_name('DescribeSensor').parameters['outputFormat']['values'][0]
>>> response = ndbc.describe_sensor(procedure=procedure, outputFormat=outputFormat)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ExceptionReport: 'foobar'

# Valid request

>>> procedure = ndbc.offerings[1].procedures[0]
>>> outputFormat = ndbc.get_operation_by_name('DescribeSensor').parameters['outputFormat']['values'][0]
>>> response = ndbc.describe_sensor(procedure=procedure, outputFormat=outputFormat)

SensorML

Imports

>>> from tests.utils import resource_file
>>> from owslib.swe.sensor.sml import SensorML
>>> from dateutil import parser
>>> import pytz

Initialize

>>> xml = open(resource_file('sml_ndbc_station.xml'), 'r').read()
>>> root = SensorML(xml)
>>> system = root.members[0]
>>> system.description
'Station metadata for 41012 - 40NM ENE of St Augustine, FL'

Contacts

>>> sorted(system.contacts.keys())
['urn:ogc:def:classifiers:OGC:contactType:operator', 'urn:ogc:def:classifiers:OGC:contactType:publisher']
>>> operators = system.get_contacts_by_role('urn:ogc:def:classifiers:OGC:contactType:operator')
>>> operators[0].role
'urn:ogc:def:classifiers:OGC:contactType:operator'
>>> operators[0].organization
'National Data Buoy Center'
>>> operators[0].country
'US'
>>> publishers = system.get_contacts_by_role('urn:ogc:def:classifiers:OGC:contactType:publisher')
>>> publishers[0].role
'urn:ogc:def:classifiers:OGC:contactType:publisher'
>>> publishers[0].organization
'National Data Buoy Center'
>>> publishers[0].country
'USA'
>>> publishers[0].phone
'228-688-2805'
>>> publishers[0].address
'Bldg. 3205'
>>> publishers[0].city
'Stennis Space Center'
>>> publishers[0].postcode
'39529'
>>> publishers[0].email
'webmaster.ndbc@noaa.gov'
>>> publishers[0].region
'MS'

Identification

>>> sorted(system.identifiers.keys())
['Long Name', 'Short Name', 'StationId']
>>> sid = system.get_identifiers_by_name('StationId')
>>> sid[0].name
'StationId'
>>> sid[0].definition
'urn:ioos:def:identifier:NOAA:stationID'
>>> sid[0].codeSpace
'http://sdf.ndbc.noaa.gov'
>>> sid[0].value
'urn:ioos:station:wmo:41012'

Classifiers

>>> system.classifiers.keys()
['Platform Type']
>>> classi = system.get_classifiers_by_name('Platform type')
>>> classi[0].name
'Platform Type'
>>> classi[0].definition
'urn:ioos:def:classifier:NOAA:platformType'
>>> classi[0].codeSpace
'http://sdf.ndbc.noaa.gov'
>>> classi[0].value
'MOORED BUOY'

Documents

>>> system.documentation 
[<owslib.swe.sensor.sml.Documentation ...>]
>>> doc = system.documentation[0].documents[0]
>>> doc.description
'Handbook of Automated Data Quality Control Checks and Procedures, National Data Buoy Center, August 2009'
>>> doc.format
'pdf'
>>> doc.url
'http://www.ndbc.noaa.gov/NDBCHandbookofAutomatedDataQualityControl2009.pdf'

History

>>> sorted(system.history.keys())
['deployment_start', 'deployment_stop']
>>> his = system.get_history_by_name('deployment_start')
>>> his
[<owslib.swe.sensor.sml.Event ...>]
>>> len(his)
2
>>> event = his[0]
>>> parser.parse(event.date).replace(tzinfo=pytz.utc).isoformat()
'2010-01-12T00:00:00+00:00'
>>> event.description
'Deployment start event'
>>> event.documentation[0].url
'http://sdftest.ndbc.noaa.gov/sos/server.php?service=SOS&request=DescribeSensor&version=1.0.0&outputformat=text/xml;subtype="sensorML/1.0.1"&procedure=urn:ioos:station:wmo:41012:20100112'

ISO

>>> from owslib.etree import etree
>>> from owslib.iso import *
>>> m=MD_Metadata(etree.parse('tests/resources/9250AA67-F3AC-6C12-0CB9-0662231AA181_iso.xml'))
>>> m.identification.topiccategory
'farming'
>>>

ISO Codelists:

Imports

>>> from tests.utils import resource_file
>>> import urllib2
>>> from owslib.etree import etree
>>> from owslib.iso import CodelistCatalogue

Print testing the code lists

>>> e=etree.parse(resource_file('gmxCodelists.xml'))
>>> c=CodelistCatalogue(e)
>>> sorted(c.getcodelistdictionaries())
['CI_DateTypeCode', 'CI_OnLineFunctionCode', 'CI_PresentationFormCode', 'CI_RoleCode', 'DQ_EvaluationMethodTypeCode', 'DS_AssociationTypeCode', 'DS_InitiativeTypeCode', 'MD_CellGeometryCode', 'MD_CharacterSetCode', 'MD_ClassificationCode', 'MD_CoverageContentTypeCode', 'MD_DatatypeCode', 'MD_DimensionNameTypeCode', 'MD_GeometricObjectTypeCode', 'MD_ImagingConditionCode', 'MD_KeywordTypeCode', 'MD_MaintenanceFrequencyCode', 'MD_MediumFormatCode', 'MD_MediumNameCode', 'MD_ObligationCode', 'MD_PixelOrientationCode', 'MD_ProgressCode', 'MD_RestrictionCode', 'MD_ScopeCode', 'MD_SpatialRepresentationTypeCode', 'MD_TopicCategoryCode', 'MD_TopologyLevelCode', 'MX_ScopeCode']
>>> sorted(c.getcodedefinitionidentifiers('CI_RoleCode'))
['author', 'custodian', 'distributor', 'originator', 'owner', 'pointOfContact', 'principalInvestigator', 'processor', 'publisher', 'resourceProvider', 'user']

CRS Handling

>>> from owslib import crs
>>> c=crs.Crs('EPSG:4326')
>>> c.code
4326
>>> c=crs.Crs('urn:ogc:def:crs:EPSG::4326')
>>> c.authority
'EPSG'
>>> c.axisorder
'yx'
>>> c=crs.Crs('http://www.opengis.net/gml/epsg.xml#4326')
>>> c.code
4326
>>> c.axisorder
'yx'
>>> c=crs.Crs('urn:x-ogc:def:crs:EPSG:6.11:2192')
>>> c.axisorder
'xy'
>>> c.code
2192
>>> c.version
'6.11'
>>> c=crs.Crs('http://www.opengis.net/def/crs/EPSG/0/4326')
>>> c.authority
'EPSG'
>>> c.code
4326
>>> c.axisorder
'yx'
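
The URN forms above can be unpacked with simple string splitting. This hypothetical helper shows the idea only; it is not the crs.Crs implementation, which also accepts bare codes, URLs, and x-ogc URNs:

```python
# Illustrative only: split an OGC CRS URN of the form
# urn:ogc:def:crs:EPSG:<version>:<code> into its parts.
def parse_crs_urn(urn):
    parts = urn.split(':')
    authority = parts[4]
    version = parts[5] or None  # the version slot may be empty
    code = int(parts[6])
    return authority, version, code

print(parse_crs_urn('urn:ogc:def:crs:EPSG::4326'))  # -> ('EPSG', None, 4326)
```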

Dublin Core

NASA DIF

FGDC

WMTS

Imports

>>> from tests.utils import scratch_file

Find out what a WMTS has to offer. Service metadata:

>>> from owslib.wmts import WebMapTileService
>>> wmts = WebMapTileService("http://map1c.vis.earthdata.nasa.gov/wmts-geo/wmts.cgi")
>>> wmts.identification.type
'OGC WMTS'
>>> wmts.identification.version
'1.0.0'
>>> wmts.identification.title
'NASA Global Imagery Browse Services for EOSDIS'
>>> str.strip(wmts.identification.abstract)
'Near real time imagery from multiple NASA instruments'
>>> wmts.identification.keywords
['World', 'Global']

Service Provider:

>>> wmts.provider.name
'National Aeronautics and Space Administration'
>>> wmts.provider.url
'http://earthdata.nasa.gov/'

Available Layers:

>>> len(wmts.contents.keys()) > 0
True
>>> sorted(list(wmts.contents))[0]
'AIRS_CO_Total_Column_Day'

Fetch a tile (using some defaults):

>>> tile = wmts.gettile(layer='MODIS_Terra_CorrectedReflectance_TrueColor', tilematrixset='EPSG4326_250m', tilematrix='0', row=0, column=0, format="image/jpeg")
>>> out = open(scratch_file('nasa_modis_terra_truecolour.jpg'), 'wb')
>>> out.write(tile.read())
>>> out.close()

Result:

WMTS GetTile generated by OWSLib

WaterML

The examples below use WaterML 1.1.

Imports

>>> from tests.utils import resource_file
>>> from owslib.waterml.wml11 import WaterML_1_1 as wml

An example GetSites response (from CUAHSI):

>>> f = open(resource_file('cuahsi_example_all_sites.xml'), 'r').read()
>>> sites = wml(f).response

View the queryInfo structure for information about the query:

>>> sites.query_info.creation_time
datetime.datetime(2009, 6, 12, 10, 47, 54, 531250, tzinfo=tzoffset(None, -25200))
>>> sites.query_info.notes
['ALL Sites(empty request)']
>>> sites.query_info.criteria.method_called
'GetSites'

Get a list of codes for the sites returned:

>>> codes = sites.site_codes
>>> sorted(codes)
[['10105900'], ['USU-LBR-Confluence'], ['USU-LBR-EFLower'], ['USU-LBR-EFRepeater'], ['USU-LBR-EFWeather'], ['USU-LBR-ExpFarm'], ['USU-LBR-Mendon'], ['USU-LBR-Paradise'], ['USU-LBR-ParadiseRepeater'], ['USU-LBR-SFLower'], ['USU-LBR-SFUpper'], ['USU-LBR-Wellsville']]

Get the names of the sites:

>>> sorted(sites.site_names)
['East Fork Little Bear River Radio Repeater near Avon, Utah', 'East Fork Little Bear River at Paradise Canal Diversion near Avon, Utah', 'Little Bear River Upper Weather Station near Avon, Utah', 'Little Bear River at McMurdy Hollow near Paradise, Utah', 'Little Bear River at Mendon Road near Mendon, Utah', 'Little Bear River at Paradise, Utah', 'Little Bear River below Confluence of South and East Forks near Avon, Utah', 'Little Bear River near Wellsville, Utah', 'Radio Repeater near Paradise, Utah', 'South Fork Little Bear River above Davenport Creek near Avon, Utah', 'South Fork Little Bear River below Davenport Creek near Avon, Utah', 'Utah State University Experimental Farm near Wellsville, Utah']

Get a site to view it in more detail:

>>> site = sites[codes[0][0]]
>>> site.geo_coords
[('-111.946402', '41.718473')]
>>> site.latitudes
['41.718473']
>>> site.longitudes
['-111.946402']
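Coordinates come back as strings; converting them for numeric work is plain Python. A small sketch using the values shown above:

```python
# (lon, lat) string pairs, in the shape site.geo_coords returns above
geo_coords = [('-111.946402', '41.718473')]

# Convert to floats for numeric use (e.g. distance calculations, plotting)
points = [(float(lon), float(lat)) for lon, lat in geo_coords]
```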
>>> info = site.site_info
>>> info.notes
[]
>>> x = info.site_properties
>>> x == {'County': 'Cache', 'PosAccuracy_m': '1', 'State': 'Utah', 'Site Comments': 'Located below county road bridge at Mendon Road crossing'}
True
>>> info.altname
>>> info.elevation
'1345'

An example GetSiteInfo response:

>>> f = open(resource_file('cuahsi_example_siteinfo_multiple.xml')).read()
>>> sites = wml(f).response
>>> sites.query_info.criteria.method_called
'GetSiteInfo'

List codes and names of the sites:

>>> codes = sites.site_codes
>>> sorted(codes)
[['USU-LBR-Mendon'], ['USU-LBR-Wellsville']]
>>> sorted(sites.site_names)
['Little Bear River at Mendon Road near Mendon, Utah', 'Little Bear River near Wellsville, Utah']

Get a site for a closer look:

>>> site = sites[codes[1][0]]
>>> site.geo_coords
[('-111.917649', '41.643457')]

Get the (first) catalog of series for the site:

>>> catalog = site[0]

Get a series from the catalog for a closer look:

>>> series = catalog[3]
>>> series.properties
{}
>>> series.begin_date_time
datetime.datetime(2007, 11, 5, 14, 30)
>>> series.end_date_time
datetime.datetime(2008, 4, 5, 20, 30)
>>> series.method_id
'23'
>>> series.source_id
'2'
>>> series.value_count
'7309'
>>> series.value_type
'Field Observation'
>>> series.name
'Turbidity'
>>> series.organization
'Utah State University Utah Water Research Laboratory'

List variable names and codes:

>>> sorted(site.variable_names)
['Battery voltage', 'Gage height', 'Oxygen, dissolved', 'Oxygen, dissolved percent of saturation', 'Phosphorus, total as P', 'Phosphorus, total as P, filtered', 'Solids, total Suspended', 'Specific conductance, unfiltered', 'Temperature', 'Turbidity', 'pH, unfiltered']
>>> sorted(site.variable_codes)
['USU10', 'USU13', 'USU3', 'USU32', 'USU33', 'USU34', 'USU35', 'USU36', 'USU39', 'USU4', 'USU40', 'USU41', 'USU5', 'USU6', 'USU7', 'USU8', 'USU9']

Get a variable by its code:

>>> variable = site['USU7']
>>> variable.variable_name
'Turbidity'
>>> variable.properties
{}
>>> variable.speciation
'Not Applicable'
>>> variable.unit.name
'nephelometric turbidity units'
>>> variable.time_scale.unit.name
'second'

An example GetValues response:

>>> f = open(resource_file('cuahsi_example_get_values.xml')).read()
>>> series = wml(f).response
>>> series.query_info.criteria.method_called
'GetValuesForASite'

List the names of the series returned (usually None):

>>> series.series_names
[None, None, None, None, None, None, None, None, None, None, None, None]

List the variables and their codes:

>>> sorted(series.variable_names)
['Battery Voltage', 'Battery voltage', 'Gage height', 'Temperature', 'Turbidity']
>>> codes = series.variable_codes
>>> sorted(codes)
['SDSC45', 'USU10', 'USU11', 'USU12', 'USU13', 'USU3', 'USU4', 'USU5', 'USU6', 'USU7', 'USU8', 'USU9']

Get variables by code:

>>> var = series.get_series_by_variable(var_code='USU4')

Get the first set of values from the first series retrieved for that code:

>>> vals = var[0].values[0]

List the dates and their corresponding measurements as (date, value) tuples:

>>> sorted(vals.get_date_values())
[(datetime.datetime(2005, 8, 5, 0, 0), '34.53'), (datetime.datetime(2005, 8, 5, 0, 30), '37.12'), (datetime.datetime(2005, 8, 5, 1, 0), '35.97'), (datetime.datetime(2005, 8, 5, 1, 30), '35.78'), (datetime.datetime(2005, 8, 5, 2, 0), '35.68'), (datetime.datetime(2005, 8, 5, 2, 30), '36.08'), (datetime.datetime(2005, 8, 5, 3, 0), '37.8'), (datetime.datetime(2005, 8, 5, 3, 30), '37.93'), (datetime.datetime(2005, 8, 5, 4, 0), '38.88'), (datetime.datetime(2005, 8, 5, 4, 30), '37.34'), (datetime.datetime(2005, 8, 5, 5, 0), '35.15'), (datetime.datetime(2005, 8, 5, 5, 30), '35.96'), (datetime.datetime(2005, 8, 5, 6, 0), '35.62'), (datetime.datetime(2005, 8, 5, 6, 30), '34.72'), (datetime.datetime(2005, 8, 5, 7, 0), '34.7'), (datetime.datetime(2005, 8, 5, 7, 30), '33.54'), (datetime.datetime(2005, 8, 5, 8, 0), '34.98'), (datetime.datetime(2005, 8, 5, 8, 30), '31.65'), (datetime.datetime(2005, 8, 5, 9, 0), '32.49'), (datetime.datetime(2005, 8, 5, 9, 30), '32.78'), (datetime.datetime(2005, 8, 5, 10, 0), '30.58'), (datetime.datetime(2005, 8, 5, 10, 30), '32.8'), (datetime.datetime(2005, 8, 5, 11, 0), '31.83'), (datetime.datetime(2005, 8, 5, 11, 30), '30.71'), (datetime.datetime(2005, 8, 5, 12, 0), '30.82'), (datetime.datetime(2005, 8, 5, 12, 30), '29.72'), (datetime.datetime(2005, 8, 5, 13, 0), '27.05'), (datetime.datetime(2005, 8, 5, 13, 30), '25.5'), (datetime.datetime(2005, 8, 5, 14, 0), '24.69'), (datetime.datetime(2005, 8, 5, 14, 30), '26.03'), (datetime.datetime(2005, 8, 5, 15, 0), '25.55'), (datetime.datetime(2005, 8, 5, 15, 30), '25.96'), (datetime.datetime(2005, 8, 5, 16, 0), '24.72'), (datetime.datetime(2005, 8, 5, 16, 30), '23.36'), (datetime.datetime(2005, 8, 5, 17, 0), '24.21'), (datetime.datetime(2005, 8, 5, 17, 30), '25.61'), (datetime.datetime(2005, 8, 5, 18, 0), '24.73'), (datetime.datetime(2005, 8, 5, 18, 30), '25.73'), (datetime.datetime(2005, 8, 5, 19, 0), '24.76'), (datetime.datetime(2005, 8, 5, 19, 30), '24.96'), (datetime.datetime(2005, 8, 5, 20, 0), 
'25.69'), (datetime.datetime(2005, 8, 5, 20, 30), '27.34'), (datetime.datetime(2005, 8, 5, 21, 0), '27.14'), (datetime.datetime(2005, 8, 5, 21, 30), '27.7'), (datetime.datetime(2005, 8, 5, 22, 0), '28.88'), (datetime.datetime(2005, 8, 5, 22, 30), '30.44'), (datetime.datetime(2005, 8, 5, 23, 0), '32.14'), (datetime.datetime(2005, 8, 5, 23, 30), '34.02'), (datetime.datetime(2005, 8, 6, 0, 0), '33.61')]
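The (date, value) tuples are plain Python objects, and the measurements are strings, so summarizing a series takes one conversion pass. A self-contained sketch over a few of the values shown above:

```python
from datetime import datetime

# A few (timestamp, value) pairs in the shape get_date_values() returns;
# WaterML reports measurements as strings, so convert before arithmetic.
pairs = [
    (datetime(2005, 8, 5, 0, 0), '34.53'),
    (datetime(2005, 8, 5, 13, 30), '25.5'),
    (datetime(2005, 8, 5, 14, 0), '24.69'),
    (datetime(2005, 8, 6, 0, 0), '33.61'),
]
readings = [(ts, float(v)) for ts, v in pairs]

# The minimum reading and the time it was observed
low_ts, low_val = min(readings, key=lambda r: r[1])
```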

An example GetVariables response:

>>> f = open(resource_file('cuahsi_example_get_variables.xml')).read()
>>> varis = wml(f).response

Just a list of variables:

>>> codes = varis.variable_codes
>>> sorted(codes)
['SDSC45', 'USU10', 'USU11', 'USU12', 'USU13', 'USU14', 'USU15', 'USU16', 'USU17', 'USU18', 'USU19', 'USU20', 'USU21', 'USU22', 'USU23', 'USU24', 'USU25', 'USU26', 'USU27', 'USU28', 'USU29', 'USU3', 'USU30', 'USU31', 'USU32', 'USU33', 'USU34', 'USU35', 'USU36', 'USU37', 'USU38', 'USU39', 'USU4', 'USU40', 'USU41', 'USU42', 'USU43', 'USU5', 'USU6', 'USU7', 'USU8', 'USU9']
>>> sorted(varis.variable_names)
['Barometric pressure', 'Battery Voltage', 'Battery voltage', 'Discharge', 'Gage height', 'Oxygen, dissolved', 'Oxygen, dissolved percent of saturation', 'Phosphorus, total as P', 'Phosphorus, total as P, filtered', 'Precipitation', 'Radiation, incoming shortwave', 'Relative humidity', 'Solids, total Suspended', 'Specific conductance, unfiltered', 'Temperature', 'Turbidity', 'Wind direction', 'Wind speed', 'pH, unfiltered']
>>> var = varis[codes[10]]
>>> var.variable_name
'Gage height'
>>> var.no_data_value
'-9999'
>>> var.properties
{}
>>> var.unit.name
'international foot'

Development

The OWSLib wiki is located at https://github.com/geopython/OWSLib/wiki

The OWSLib source code is available at https://github.com/geopython/OWSLib

You can find software metrics on the OWSLib ohloh page at http://www.ohloh.net/p/OWSLib.

Testing

Support

Mailing Lists

OWSLib provides mailing lists for both users and developers. Subscription options and archives are available at http://lists.osgeo.org/mailman/listinfo/owslib-users and http://lists.osgeo.org/mailman/listinfo/owslib-devel.

Submitting Questions to Community

To submit questions to a mailing list, first join the list by following the subscription procedure above. Then post questions to the list by sending an email message to either owslib-users@lists.osgeo.org or owslib-devel@lists.osgeo.org.

Searching the Archives

All community archives are located at http://lists.osgeo.org/pipermail/owslib-users/ and http://lists.osgeo.org/pipermail/owslib-devel/.

Metrics

You can find out about software metrics at the OWSLib ohloh page.

IRC

You can also join the #geopython channel on freenode IRC for real-time discussion.

Logging

OWSLib logs messages to the ‘owslib’ named Python logger. You can configure your application to consume these messages like so:

import logging
owslib_log = logging.getLogger('owslib')
# Add formatting and handlers as needed, for example:
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
owslib_log.addHandler(handler)
owslib_log.setLevel(logging.DEBUG)

License

Copyright (c) 2006, Ancient World Mapping Center All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  • Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
  • Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
  • Neither the name of the University of North Carolina nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Credits