SDK Reference

This page contains the full documentation for each function of the Studio Data SDK. Use the sidebar to navigate to any function.


Sharing Functions

This section contains detailed reference documentation for the Data SDK's sharing functions.

To view each function with a brief description and an example, visit the sharing page.

get_permissions

Get permissions for a resource stored in your Studio workspace.

Permissions include "viewer" and "editor", and can be assigned to individuals (via email) or to your entire organization via the set_permissions function.

Method

data_sdk.get_permissions(
  self,
  *,
  resource_type: ResourceType,
  resource_id: UUID | str,
) -> CategorizedPermissions:
uf-data-sdk get-permissions
  --resource-type <Type of resource, map, dataset, etc.>
  --resource-id <Resource UUID>
GET https://data-api.foursquare.com/v1/permissions/map/<uuid> HTTP/1.1
GET https://data-api.foursquare.com/v1/permissions/dataset/<uuid> HTTP/1.1

Response

A list of categorized permissions for both the organization and any other users (via email).

{
    "organization": "viewer",
    "users": [
        {
            "email": "[email protected]",
            "permission": "editor"
        },
        {
            "email": "[email protected]",
            "permission": "viewer"
        }
    ]
}

Examples

data_sdk.get_permissions(
  resource_type = "map",
  resource_id = "[UUID]"
)
uf-data-sdk get-permissions
  --resource-type "map"
  --resource-id "[UUID]"
curl -X GET 'https://data-api.foursquare.com/v1/permissions/map/1l9e5c4e-2f3-4f24-19fb-7e7514b43c44' --header 'Authorization: Bearer <token>'

curl -X GET 'https://data-api.foursquare.com/v1/permissions/dataset/1l9e5c4e-2f3-4f24-19fb-7e7514b43c44' --header 'Authorization: Bearer <token>'

set_permissions

Set permissions for a resource stored in your Studio workspace.

Permissions include "viewer" and "editor", and can be assigned to individuals (via email) or to your entire organization. Each call to set_permissions replaces the resource's permission set, so to remove a permission, simply omit it from the permissions you pass.

Method

data_sdk.set_permissions(
  self,
  *,
  resource_type: ResourceType,
  resource_id: UUID | str,
  permissions: CategorizedPermissions | Dict,
) -> None:
uf-data-sdk set-permissions
  --resource-type <Type of resource, map, dataset, etc.>
  --resource-id <Resource UUID>
  --organization <viewer or editor>
  --viewer <User (email) with 'viewer' permissions>
  --editor <User (email) with 'editor' permissions>
POST https://data-api.foursquare.com/v1/permissions/ HTTP/1.1

Body

The body of the request should contain the permissions encoded as a JSON blob.

--data-raw '{
    "resourceType": "map",
    "resourceId": "1l9e5c4e-2f3-4f24-19fb-7e7514b43c44",
    "permissions": {
        "users": [
            {
                "email": "[email protected]",
                "permission": "editor"
            }, {
                "email": "[email protected]",
                "permission": "viewer"
            }
        ]
    }
}'

Response

If successful, you will receive a message indicating which records were created, updated, and removed.

{
    "message": "Permissions set successfully (2 created, 0 updated, 0 removed)."
}

If certain emails don't have a Studio account associated with them, they will be skipped.

{
    "message": "Permissions set successfully (4 created, 2 updated, 1 removed). Some emails were skipped because there is no account associated with them: [email protected]."
}

Collaborators can create an account by visiting https://studio.foursquare.com/. If you are looking to collaborate with coworkers, we recommend you invite them into your organization.

Examples

data_sdk.set_permissions(
    resource_type = "map",
    resource_id = "[UUID]",
    permissions = {
      "organization": "viewer",
      "users": [
        {
          "email": "[email protected]",
          "permission": "editor"
        },
        {
          "email": "[email protected]",
          "permission": "viewer"
        }
      ]
    }
  )
uf-data-sdk set-permissions
  --resource-type "map"
  --resource-id "[UUID]"
  --editor [email protected]
  --editor [email protected]
  --viewer [email protected]
  --viewer [email protected]
  --editor [email protected]
curl -X POST 'https://data-api.foursquare.com/v1/permissions' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <token>' \
--data-raw '{
    "resourceType": "map",
    "resourceId": "1l9e5c4e-2f3-4f24-19fb-7e7514b43c44",
    "permissions": {
        "users": [
            {
                "email": "[email protected]",
                "permission": "editor"
            }, {
                "email": "[email protected]",
                "permission": "viewer"
            }
        ]
    }
}'

Map Functions

This section contains detailed reference documentation covering the Data SDK's map functions.

To view each function with a brief description and an example, visit the map functions documentation.


create_map

Create a map record from JSON, including the map configuration and list of associated datasets.

Method

data_sdk.create_map(
      name: Optional[str] = None,
      description: Optional[str] = None,
      map_state: Optional[MapState] = None,
      datasets: Optional[Iterable[Union[Dataset, UUID, str]]] = None) -> Map:
uf-data-sdk create-map \
      --name <name> \
      --description <description> \
      --map-state <path> \
      --dataset-ids <uuid1>,<uuid2>
POST https://data-api.foursquare.com/v1/maps/ HTTP/1.1

Body

The body of the request should be the JSON data for the map record you want to create. All properties are optional, and unknown properties or those which cannot be updated will be ignored. In order to refer to datasets in the map state, they must be included in the datasets list, which can be either a list of dataset UUIDs or a list of objects in the form {"id": "string"}.
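The two accepted forms for the datasets list are interchangeable; a short sketch with placeholder UUIDs:

```python
# Datasets can be referenced either as bare UUID strings...
dataset_ids = ["<uuid1>", "<uuid2>"]

# ...or as objects of the form {"id": "<uuid>"} in the request body.
dataset_refs = [{"id": dataset_id} for dataset_id in dataset_ids]

# Either form may be passed to create_map:
# data_sdk.create_map(name="map name", datasets=dataset_ids)
```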

Response

Updated map record

{
  "id": "string",
  "name": "string",
  "createdAt": "2020-11-03T21:27:14.000Z",
  "updatedAt": "2020-11-13T01:44:07.000Z",
  "description": "string",
  "privacy": "private",
  "permission": "editor",
  "latestState": {
    "id": "string",
    "data": MapConfig
  },
  "datasets": [
    {
      "id": "string",
      "name": "string",
      "createdAt": "2020-11-10T18:09:39.000Z",
      "updatedAt": "2020-11-10T18:09:39.000Z",
      "privacy": "private",
      "permission": "editor",
      "isValid": true
    }
  ]
}

Example

unfolded_map = data_sdk.create_map(
      name="map name",
      description="map description",
      map_state={"id": "<uuid>", "data": {...}},
      datasets=['<uuid1>', '<uuid2>'])
uf-data-sdk create-map \
      --name "map name" \
      --description "map description" \
      --map-state /path/to/map-state.json \
      --dataset-ids <uuid1>,<uuid2>
curl -X POST https://data-api.foursquare.com/v1/maps/ \
  -H 'Authorization: Bearer <token>' \
  -H 'Content-Type: application/json' \
  --data-binary '@/path/to/my_map.json'

copy_map

Creates a copy of an existing map, duplicating its layers and map state.

The user must choose whether to copy the target map's datasets or point to them as a data source.

Method

data_sdk.copy_map(
      map: Union[Map, str, UUID],
      copy_datasets: bool,
      name: Optional[str] = None
      ) -> Map:
uf-data-sdk clone-map \
      --map-id <map-id> \
      --clone-datasets/--no-clone-datasets \
      --name <name>
curl --request POST \
     --url https://data-api.foursquare.com/v1/maps/<map_id>/copy \
     --header 'Authorization: Bearer <Token>' \
     --header 'accept: application/json' \
     --header 'content-type: application/json' \
     --data '
{
  "copyDatasets": true,
  "name": "map_copy"
}
'

Body

Set the copyDatasets boolean to control whether datasets are also copied during the map copy operation. You can also provide a name for the new copy of the map with the name field.

Positional Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| map | Map, string, or UUID | The map record to copy. Can be a Map object representing a created map or a string/UUID id pointing to an existing map. |

Keyword Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| copy_datasets | bool | Required. If True, copy all underlying datasets of the target map. |
| name | string | The name to give the copied map. Default: "Copy of {source_map_name}" |

Response

{
  "id": "string",
  "name": "string",
  "createdAt": "2020-11-03T21:27:14.000Z",
  "updatedAt": "2020-11-13T01:44:07.000Z",
  "description": "string",
  "privacy": "private",
  "permission": "editor",
  "latestState": {
    "id": "string",
    "data": MapConfig
  },
  "datasets": [
    {
      "id": "string",
      "name": "string",
      "createdAt": "2020-11-10T18:09:39.000Z",
      "updatedAt": "2020-11-10T18:09:39.000Z",
      "privacy": "private",
      "permission": "editor",
      "isValid": true
    }
  ]
}

Examples

unfolded_map = data_sdk.copy_map(
      map = "<uuid>",
      copy_datasets = True,
      name = "My Copied Map Example")
uf-data-sdk clone-map \
      --map-id "<uuid>" \
      --clone-datasets \
      --name "My Cloned Map Example"
curl -X POST https://data-api.foursquare.com/v1/maps/<uuid>/copy \
    -H 'Authorization: Bearer <token>'

replace_dataset

Replace a dataset on a map, updating the visualization with the data from the new dataset.

By default, this function expects a dataset with an identical schema and will error if the new dataset is not compatible with the old one. To override the error, set force=True. To use strict type checking, set strict=True.

Method

data_sdk.replace_dataset(
        map: Union[Map, str, uuid.UUID],
        dataset_to_replace: Union[Dataset, str, uuid.UUID],
        dataset_to_use: Union[Dataset, str, uuid.UUID],
        force: bool = False,
        strict: bool = False,
        ) -> Map
uf-data-sdk replace-dataset \
      --map-id \
      --dataset-to-replace-id \
      --dataset-to-use-id \
      --force \
      --strict
POST https://data-api.foursquare.com/v1/maps/<uuid>/datasets/replace HTTP/1.1

Positional Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| map | Map, string, or UUID | Required. The map record containing the dataset to replace. Can be a Map object representing a created map or a string/UUID id pointing to an existing map. |
| dataset_to_replace | Dataset, string, or UUID | Required. The dataset to replace. Can be a Dataset object representing a dataset or a string/UUID id pointing to an existing dataset. |
| dataset_to_use | Dataset, string, or UUID | Required. The new dataset to use in the replace operation. Can be a Dataset object representing a dataset or a string/UUID id pointing to an existing dataset. |

Keyword Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| force | bool | If True, force the dataset replacement operation, overriding any errors caused by mismatched schemas. Default: False |
| strict | bool | If True, use strict type checking and throw an error if fields don't have the same type. Default: False |

Response

The Map object that was operated on.

{
  "id": "string",
  "name": "string",
  "createdAt": "2020-11-03T21:27:14.000Z",
  "updatedAt": "2020-11-13T01:44:07.000Z",
  "description": "string",
  "privacy": "private",
  "permission": "editor",
  "latestState": {
    "id": "string",
    "data": MapConfig
  },
  "datasets": [
    {
      "id": "string",
      "name": "string",
      "createdAt": "2020-11-10T18:09:39.000Z",
      "updatedAt": "2020-11-10T18:09:39.000Z",
      "privacy": "private",
      "permission": "editor",
      "isValid": true
    }
  ]
}

Examples

data_sdk.replace_dataset(
      map = "38bbed5-eb0e-4c65-8bcc-cc173dc497qb",
      dataset_to_replace = "750dfn07-f8b9-4d37-b698-bacd1d8e6156",
      dataset_to_use =  "c9ff8f3e-8821-4k68-b7fc-94cb95fe65e2"
)
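To relax or tighten the schema check, pass the force and strict keywords; a sketch with placeholder ids:

```python
# Keyword arguments controlling the schema compatibility checks.
replace_kwargs = dict(
    map="<map-uuid>",
    dataset_to_replace="<old-dataset-uuid>",
    dataset_to_use="<new-dataset-uuid>",
    force=True,    # override errors from mismatched schemas
    strict=False,  # set True to also require identical field types
)
# data_sdk.replace_dataset(**replace_kwargs)
```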
curl --request POST \
     --url https://data-api.foursquare.com/v1/maps/<uuid>/datasets/replace \
     --header 'Authorization: Bearer <Token>' \
     --header 'accept: application/json' \
     --header 'content-type: application/json' \
     --data '
{
  "datasetToReplaceId": "dataset_1",
  "datasetToUseId": "dataset_2",
  "force": false,
  "strict": false
}
'

get_map_by_id

Get a map record by id.

data_sdk.get_map_by_id(uuid: str) -> Map
uf-data-sdk get-map <uuid>
GET https://data-api.foursquare.com/v1/maps/<uuid> HTTP/1.1

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| id | string | The UUID of the map record to get. |

Response

Map record, including the full map state and a list of associated datasets. The map state (the configuration of the map's styles, layers, etc.) is omitted in the sample record below due to its complexity.

{
  "id": "string",
  "name": "string",
  "createdAt": "2020-11-03T21:27:14.000Z",
  "updatedAt": "2020-11-13T01:44:07.000Z",
  "description": "string",
  "privacy": "private",
  "permission": "editor",
  "latestState": {
    ...
  },
  "datasets": [
    {
      "id": "string",
      "name": "string",
      "createdAt": "2020-11-10T18:09:39.000Z",
      "updatedAt": "2020-11-10T18:09:39.000Z",
      "privacy": "private",
      "permission": "editor",
      "isValid": true
    }
  ]
}

Example

unfolded_map = data_sdk.get_map_by_id("<uuid>")
uf-data-sdk get-map <uuid>
curl -X GET https://data-api.foursquare.com/v1/maps/<uuid> \
  -H 'Authorization: Bearer <token>'

update_map

Update a map record, including the latest state and list of associated datasets.

update_map(
      map_id: Union[Map, UUID, str],
      name: Optional[str] = None,
      description: Optional[str] = None,
      map_state: Optional[MapState] = None,
      datasets: Optional[Iterable[Union[Dataset, UUID, str]]] = None) -> Map:
uf-data-sdk update-map \
      --map-id <uuid> \
      --name <name> \
      --description <description> \
      --map-state <path> \
      --dataset-ids <uuid1>,<uuid2>
PUT https://data-api.foursquare.com/v1/maps/{id} HTTP/1.1

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| id | string | The UUID of the map record to update. |
| name | string | A new name for the map. |
| description | string | A new description for the map. |
| map_state | MapState | The latest MapState of the Studio map object. |
| datasets | Dataset list | A list of Dataset objects associated with the map. |

Body

The body of the request should be the JSON data for the map record you want to update. All properties are optional, and unknown properties or those which cannot be manually updated will be ignored. In order to refer to datasets in the map state, they must be included in the datasets list, which can be either a list of dataset UUIDs or a list of objects in the form {"id": <uuid>}.

Response

Updated map record

{
  "id": "string",
  "name": "string",
  "createdAt": "2020-11-03T21:27:14.000Z",
  "updatedAt": "2020-11-13T01:44:07.000Z",
  "description": "string",
  "privacy": "private",
  "permission": "editor",
  "latestState": {
    "id": "string",
    "data": MapConfig
  },
  "datasets": [
    {
      "id": "string",
      "name": "string",
      "createdAt": "2020-11-10T18:09:39.000Z",
      "updatedAt": "2020-11-10T18:09:39.000Z",
      "privacy": "private",
      "permission": "editor",
      "isValid": true
    }
  ]
}

Example

data_sdk.update_map(
    map_id = map.id,
    map_state = {
        "id": map.latest_state.id,
        "data": map.latest_state.data
    }
)
uf-data-sdk update-map \
      --map-id "map-uuid" \
      --name "new name" \
      --description "new description" \
      --map-state map-state.json \
      --dataset-ids <uuid1>,<uuid2>
curl -X PUT https://data-api.foursquare.com/v1/maps/<uuid> \
    -H 'Authorization: Bearer <token>' \
    -H 'Content-Type: application/json' \
    --data-binary '@/path/to/my_map.json'

delete_map

Delete a map record by id. This will not delete datasets associated with the map.

data_sdk.delete_map(uuid: str) -> None
uf-data-sdk delete-map <uuid>
DELETE https://data-api.foursquare.com/v1/maps/<uuid> HTTP/1.1

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| id | string | The UUID of the map record to delete. |

Response

A message indicating if deletion was successful.

{
  "message": "string"
}

Example

# List maps on account
maps = data_sdk.list_maps()
# Select map, then delete
map_to_delete = maps[0]
data_sdk.delete_map(map_to_delete)
uf-data-sdk delete-map <uuid>
curl -X DELETE https://data-api.foursquare.com/v1/maps/<uuid> \
  -H 'Authorization: Bearer <token>'

list_maps

Get all map records for the authenticated user.

maps = data_sdk.list_maps()
uf-data-sdk list-maps
GET https://data-api.foursquare.com/v1/maps HTTP/1.1

Get all map records for the organization of the authenticated user.

org_maps = data_sdk.list_maps(organization=True)
uf-data-sdk list-maps --organization
GET https://data-api.foursquare.com/v1/maps/for-organization HTTP/1.1

Named Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| organization | boolean | If True, list map records for the organization of the authenticated user. |

Response

List of map records.

{
  "items": [
    {
      "id": "string",
      "name": "string",
      "createdAt": "2020-11-03T21:27:14.000Z",
      "updatedAt": "2020-11-13T01:44:07.000Z",
      "description": "",
      "privacy": "private",
      "permission": "editor"
    }
  ]
}

For non-enterprise users, organization=True will cause the request to fail with:

403: Insufficient permission level to perform this action
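A defensive caller can fall back to the personal map list when the organization-wide request is rejected. A sketch; the exact exception type raised by the SDK is not specified here, so a broad except is used:

```python
def list_maps_with_fallback(sdk):
    """Try the organization-wide listing, falling back to personal maps."""
    try:
        return sdk.list_maps(organization=True)
    except Exception:
        # Non-enterprise accounts receive a 403 for organization=True.
        return sdk.list_maps()
```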

Example

maps = data_sdk.list_maps()
uf-data-sdk list-maps
curl -X GET https://data-api.foursquare.com/v1/maps -H 'Authorization: Bearer <token>'

Data Functions

This section contains detailed reference documentation covering the Data SDK's dataset functions.

To view each function with a brief description and an example, visit the data functions documentation.


upload_file

Create a dataset from a data upload.

# upload a file
data_sdk.upload_file(
  self,
  file: Union[BinaryIO, str, Path],
  name: Optional[str] = None,
  *,
  dataset: Optional[Union[Dataset, str, UUID]] = None,
  media_type: Optional[Union[str, MediaType]] = None,
  description: Optional[str] = None,
  chunk_size: int = 256 * 1024,
  progress: bool = False,
) -> Dataset:
# upload pandas DataFrame or geopandas GeoDataFrame
data_sdk.upload_dataframe(
  self,
  df: Union["pd.DataFrame", "gpd.GeoDataFrame"],
  name: Optional[str] = None,
  index: bool = True,
  **kwargs: Any,
) -> Dataset:
uf-data-sdk upload-file --name <name> --desc <description> --media-type <media_type> <path>
POST https://data-api.foursquare.com/v1/datasets/data?name={name}&description={description} HTTP/1.1

Parameters

HTTP API

| Parameter | Type | Description |
| --- | --- | --- |
| name | string | Name of the dataset to create. |
| description | string | Optional. Description of the dataset to create. |

Headers

| Header | Description |
| --- | --- |
| Content-Type | Required. MIME type of the data you are uploading, e.g. text/csv or application/json. |

Body

The body of the request should be the binary data you want to upload, in a format matching the supplied Content-Type.

Python

Use upload_file for uploading data files.

Positional Arguments

| Argument | Type | Description |
| --- | --- | --- |
| file | string of a path, or file object | Path or file object to use for uploading data. |
| name | string | Optional. Name of the dataset to create. |

Keyword Arguments

| Argument | Type | Description |
| --- | --- | --- |
| dataset | Dataset, str, UUID | Optional. If provided, dataset whose data should be updated. Otherwise, creates a new dataset. |
| media_type | string or MediaType | Optional. File type (e.g. MediaType.CSV or text/csv). By default, tries to infer the media type from the file name. |
| description | string | Optional. Description of the dataset to create. |
| chunk_size | int | Optional. Number of bytes to upload at a time. Used for the progress bar. Default: 256 * 1024 |
| progress | bool | Optional. When True, display a progress bar. |

Example

# upload a file
data_sdk.upload_file(
    file='new_file.csv',
    name='Dataset name',
    media_type=MediaType.CSV,
    description='Dataset description')
# upload pandas or geopandas dataframe
data_sdk.upload_dataframe(
    dataframe,
    name='Dataset name',
    description='Dataset description')
uf-data-sdk upload-file \
  --name "Dataset name" \
  --desc "Dataset description" \
  --media-type text/csv \
  new_file.csv
curl -X POST https://data-api.foursquare.com/v1/datasets/data?name=My+Dataset \
-H 'Authorization: Bearer <token>' \
-H 'Content-Type: text/csv' \
--data-binary '@/path/to/my_dataset.csv'

upload_dataframe

You can upload pandas/geopandas dataframes directly to the Studio cloud, creating a new dataset.

| Argument | Type | Description |
| --- | --- | --- |
| df | pandas.DataFrame or geopandas.GeoDataFrame | Either a pandas DataFrame or a geopandas GeoDataFrame to upload to Studio. |
| name | string | Name of the dataset record. Required if creating a new dataset record (instead of updating an existing dataset record). |
| index | boolean | Optional. If True, include row names in the output. Default: True |

Response

Created dataset record

{
  "id": "string",
  "name": "string",
  "createdAt": "2021-02-04T00:17:38.652Z",
  "updatedAt": "2021-02-04T00:17:38.652Z",
  "description": "string",
  "isValid": true
}

Example

data_sdk.upload_dataframe(
    dataframe,
    name='Dataset name',
    description='Dataset description')

create_external_dataset

Beta feature

Create an external dataset record referencing a dataset by URL. External datasets will be loaded from source every time, and will not be stored in our system. If the URL references a cloud storage object, e.g. with the s3:// or gcs:// protocol, and that URL requires authentication, you can include a data connector id referencing a connector with appropriate privileges to read that object.

Note that this feature is in beta and may not work for all datasets.

Method

create_external_dataset(
  self,
  *,
  name: str,
  description: str | None = None,
  source: str,
  connector: DataConnector | UUID | str | None = None,
) -> Dataset:
uf-data-sdk create-external-dataset \
  --source <Source URL of the dataset> \
  --name <Name of the new dataset> \
  --description <Description of the new dataset> \
  --connector-id <Id of optional Data Connector to use>

POST https://data-api.foursquare.com/v1/datasets/ HTTP/1.1

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| source | string | Required. The source URL of the dataset. |
| name | string | Required. The name of the dataset. |
| description | string | Description for the dataset record. |
| connector | string | ID of an (optional) associated data connector, for cloud storage URLs. |

Body

External dataset parameters encoded as a JSON blob.

--data '{
    "name": "My S3 Dataset",
    "type": "externally-hosted",
    "metadata": {
      "source": "s3://my-bucket/path/to/data.parquet"
    },
  "dataConnectionId": "<SOME_ID>"
}'

Response

New dataset record.

Examples

data_sdk.create_external_dataset(
  name = "test-external-dataset",
  description = "my external dataset",
  source = "https://s3data.example.com/data-source",
  connector = "<data-connector-uuid>"
)
uf-data-sdk create-external-dataset \
  --source "https://s3data.example.com/data-source" \
  --name "test-external-dataset" \
  --description "my external dataset" \
  --connector-id "<data-connector-uuid>"

curl -X POST 'https://data-api.foursquare.com/catalog/v1/datasets' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <TOKEN>' \
--data '{
    "name": "My S3 Dataset",
    "type": "externally-hosted",
    "metadata": {
      "source": "s3://my-bucket/path/to/data.parquet"
    },
  "dataConnectionId": "<SOME_ID>"
}'

create_query_dataset

Create a dataset from a query.

create_query_dataset(
        self,
        connector: DataConnector | UUID | str,
        query: str,
        name: str,
        description: str | None = None,
) -> Dataset:
uf-data-sdk create-query-dataset --connector-id <connector uuid> --query <SQL query to use> --name <name for the new queried dataset> --description <description of the queried dataset>
POST https://data-api.foursquare.com/v1/datasets/data-query HTTP/1.1

Parameters

| Parameter | Description |
| --- | --- |
| connector | Required. The data connector to use, or its UUID. |
| query | Required. The SQL query. |
| name | Name of the dataset record. |
| description | Description for the dataset record. |

Response

The newly created dataset object.

Examples

dataset = data_sdk.create_query_dataset(
    connector="<connector-uuid>",
    query="select * from table;",
    name="query-dataset",
    description="sample-description")
uf-data-sdk create-query-dataset --connector-id <connector uuid> --query <SQL query to use> --name <name for the new queried dataset> --description <description of the queried dataset>
POST https://data-api.foursquare.com/v1/datasets/data-query HTTP/1.1

execute_query

Execute a query against a data connector, returning a dataframe with the results of the query, or None if the output was written to a file.

 execute_query(
    self,
    connector: DataConnector | UUID | str,
    query: str,
    output_file: str | None = None,
    output_format: QueryOutputType | str | None = None,
) -> pd.DataFrame | None:
uf-data-sdk execute-query --connector-id <connector uuid> --query <SQL query to use>
POST https://data-api.foursquare.com/v1/query/gateway/data-queries HTTP/1.1

Parameters

| Parameter | Description |
| --- | --- |
| connector | Required. The data connector to use, or its UUID. |
| query | Required. The SQL query. |
| output_file | The path to write the query output to. |
| output_format | The format in which to write the output. |

Response

A dataframe containing the results of the query, or None if the output was written to a file.
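Large results can therefore be streamed straight to disk instead of being materialized in memory. A sketch with placeholder values; the "csv" format name is an assumption (see QueryOutputType for the accepted values):

```python
# With output_file set, results are written to disk and the call returns None.
query_kwargs = dict(
    query="select * from table;",
    output_file="results.csv",
    output_format="csv",  # assumed format name
)
# result = data_sdk.execute_query("<connector-uuid>", **query_kwargs)
# result is None because output_file was provided
```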

Example

 df = data_sdk.execute_query(
    example_data_connector.id,
    "select * from table;"
)
uf-data-sdk execute-query --connector-id <connector uuid> --query <SQL query to use>
POST https://data-api.foursquare.com/v1/query/gateway/data-queries HTTP/1.1

get_dataset_by_id

Retrieve a dataset metadata record in JSON format.

data_sdk.get_dataset_by_id(dataset: Union[Dataset, str, UUID]) -> Dataset
uf-data-sdk get-dataset --dataset-id <uuid>
GET https://data-api.foursquare.com/v1/datasets/<uuid> HTTP/1.1

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| dataset | dataset object or string | Dataset object or UUID of the dataset record to retrieve. |

Response

Dataset record

{
  "id": "string",
  "name": "string",
  "createdAt": "2021-02-04T00:17:38.652Z",
  "updatedAt": "2021-02-04T00:17:38.652Z",
  "description": "string",
  "isValid": true
}

Example

# List and select dataset
datasets = data_sdk.list_datasets()
dataset = datasets[0]
# Get dataset record
data_sdk.get_dataset_by_id(dataset)
uf-data-sdk get-dataset --dataset-id <uuid>
curl -X GET https://data-api.foursquare.com/v1/datasets/<uuid> \
  -H 'Authorization: Bearer <token>'

download_dataset

Download data from a dataset record by id.

data_sdk.download_dataset(
    dataset: Union[Dataset, str, UUID],
    output_file: Optional[Union[BinaryIO, str, Path]] = None) -> Optional[bytes]
data_sdk.download_dataframe(
    dataset: Union[Dataset, str, UUID]) -> Union[pandas.DataFrame, geopandas.GeoDataFrame]
uf-data-sdk download-dataset --dataset-id <uuid> --output-file <path>
GET https://data-api.foursquare.com/v1/datasets/<uuid>/data HTTP/1.1

Parameters

download_dataset

| Parameter | Type | Description |
| --- | --- | --- |
| dataset | dataset object or string | Dataset object or UUID of the dataset record to download. |
| output_file | string of a path, or file object | If provided, a path or file object to write the dataset's data to. Otherwise, a bytes object with the dataset's data is returned. |

download_dataframe

| Parameter | Type | Description |
| --- | --- | --- |
| dataset | dataset object or string | Dataset object or UUID of the dataset record to download. |

Response

download_dataset

If output_file was provided, returns None and writes data to the provided file. Otherwise, returns a bytes object with the dataset's data.

download_dataframe

Returns either a pandas DataFrame or a geopandas GeoDataFrame. If the original dataset was a CSV file, a pandas DataFrame will be returned. If it was a GeoJSON file, a geopandas GeoDataFrame will be returned.
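Since the return type depends on the source format, callers that handle both can branch on the geometry accessor that GeoDataFrames expose. A small helper sketch:

```python
def describe_download(df):
    """Label a downloaded frame as tabular (pandas) or geospatial (geopandas)."""
    # GeoDataFrames expose a .geometry accessor; plain DataFrames do not.
    kind = "geospatial" if hasattr(df, "geometry") else "tabular"
    return f"{kind} result with {len(df)} rows"

# df = data_sdk.download_dataframe(dataset)
# describe_download(df)
```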

Example

datasets = data_sdk.list_datasets()
dataset = datasets[0]
# download to local file
data_sdk.download_dataset(dataset, output_file='output.csv')
# download to buffer
buffer = data_sdk.download_dataset(dataset)
# download to a dataframe
df = data_sdk.download_dataframe(dataset)
uf-data-sdk download-dataset --dataset-id <uuid> --output-file output.csv
curl -X GET https://data-api.foursquare.com/v1/datasets/<uuid>/data \
  -H 'Authorization: Bearer <token>'

update_dataset

Update a dataset record, including the underlying data, and metadata such as name or description.

update_dataset(
    dataset_id: Dataset | str | UUID,
    name: str | None = None,
    description: str | None = None,
    file: BinaryIO | str | Path | None = None,
    media_type: str | MediaType | None = None,
    **kwargs: Any,
) -> Dataset:
uf-data-sdk update-dataset \
    --dataset-id <id> \
    --media-type <media_type> \
     --file <path> \
    --name <name> \
    --description <description>
PUT https://data-api.foursquare.com/v1/datasets/{id}/data HTTP/1.1
PUT https://data-api.foursquare.com/v1/datasets/{id} HTTP/1.1

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| dataset_id | string | The UUID of the dataset record to update. |
| name | string | A new name for the dataset. |
| description | string | A new description for the dataset. |
| file | string of a path, or file object | The new data to use for the dataset. |
| media_type | string | The media type of the new data. |

Body

There are two HTTP endpoints called by the function. For the PUT /datasets/{id}/data endpoint to update the dataset data, the body of the request should be the binary data you want to upload, in a format matching the supplied Content-Type header.

For the PUT /datasets/{id} endpoint to update the dataset metadata, the body of the request should be the new metadata in JSON format, with the appropriate Content-Type header.
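In other words, which endpoint is called depends on which arguments you pass. The split can be sketched as:

```python
# Passing only name/description updates the metadata record
# (PUT /datasets/{id}).
metadata_update = {"name": "New name", "description": "New description"}

# Passing a file (with its media type) also replaces the underlying data
# (PUT /datasets/{id}/data).
data_update = {"file": "new_file.csv", "media_type": "text/csv"}

# data_sdk.update_dataset("<dataset-uuid>", **metadata_update)
# data_sdk.update_dataset("<dataset-uuid>", **data_update)
```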

Response

Updated dataset record

{
  "id": "string",
  "name": "string",
  "createdAt": "2021-02-04T00:17:38.652Z",
  "updatedAt": "2021-02-04T00:17:38.652Z",
  "description": "string",
  "isValid": true
}

Example

data_sdk.update_dataset(
    dataset,
    name='New name',
    description='New description',
    file='new_file.csv',
    media_type=MediaType.CSV
)
uf-data-sdk update-dataset \
    --dataset-id <id> \
    --media-type <media_type> \
     --file <path> \
    --name <name> \
    --description <description>
curl -X PUT https://data-api.foursquare.com/v1/datasets/<uuid>/data \
    -H 'Authorization: Bearer <token>' \
    -H 'Content-Type: text/csv' \
    --data-binary '@/path/to/my_dataset.csv'

curl --request PUT \
     --url https://data-api.foursquare.com/v1/datasets/<uuid> \
     --header 'Authorization: Bearer <token>' \
     --header 'accept: application/json' \
     --header 'content-type: application/json' \
     --data '
{
  "name": "New name",
  "description": "New description"
}
'

delete_dataset

Delete a dataset record by id. This will also delete any data associated with the dataset.

❗️

Warning

This operation cannot be undone. If you delete a dataset currently used in one or more maps, the dataset will be removed from those maps, possibly causing them to render incorrectly.

data_sdk.delete_dataset(dataset: Union[Dataset, str, UUID]) -> None
uf-data-sdk delete-dataset --dataset-id <uuid>
DELETE https://data-api.foursquare.com/v1/datasets/<uuid> HTTP/1.1

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| dataset | string or dataset object | The UUID or dataset object of the dataset to delete. |

Response

A message indicating if deletion was successful.

{
  "message": "string"
}

Example

datasets = data_sdk.list_datasets()
dataset = datasets[0]
data_sdk.delete_dataset(dataset)
uf-data-sdk delete-dataset --dataset-id <uuid>
curl -X DELETE https://data-api.foursquare.com/v1/datasets/<uuid> \
  -H 'Authorization: Bearer <token>'

generate_vectortile

Create Vector Tiles by specifying a source GeoJSON (.geojson), CSV (.csv), or FlatGeobuf (.fgb) file and, optionally, a target dataset.

For GeoJSON and FlatGeobuf source files, geometry can be read automatically; for CSV source files, the source latitude/longitude columns must be specified.

Method

generate_vectortile(
  self,
  source: Dataset | str | UUID,
  *,
  target: Dataset | str | UUID | None = None,
  source_lat_column: str | None = None,
  source_lng_column: str | None = None,
  attributes: List[str] | None = None,
  exclude_all_attributes: bool | None = None,
  tile_size_kb: int | None = None,
) -> Dataset:
uf-data-sdk generate-vectortile \
  --source <Source URL of the dataset> \
  --target <Optional target dataset> \
  --source-lat-column lat \
  --source-lng-column lng

POST https://data-api.foursquare.com/v1/datasets/vectortile

Parameters

Parameter | Type | Description
source | string | Required. The source URL of the dataset.
target | string | Optional target dataset to overwrite.
source_lat_column | string | Source latitude column (CSV only).
source_lng_column | string | Source longitude column (CSV only).
attributes | List[string] | Attributes to keep.
exclude_all_attributes | bool | Whether to exclude all attributes.
tile_size_kb | int | Maximum tile size (in kilobytes).

Body

Vector tile parameters encoded as a JSON blob.

{
  "source": "source-dataset-uuid",
  "target": "target-dataset-uuid",
  "sourceLatColumn": "lat",
  "sourceLngColumn": "lng",
  "attributes": ["foo", "bar"],
  "tileSizeKb": 2000
}

Response

New dataset record.

Examples

data_sdk.generate_vectortile(
    source="source_dataset_uuid",
    target=None,
    source_lat_column="lat",
    source_lng_column="lng",
    attributes=["foo", "bar"],
    tile_size_kb=2000
)
uf-data-sdk generate-vectortile \
  --source "source-dataset-uuid" \
  --target "optional-target-uuid" \
  --source-lat-column lat \
  --source-lng-column lng \
  -y foo -y bar \
  --tile-size-kb 2000

curl --request POST \
     --url https://data-api.foursquare.com/v1/datasets/vectortile \
     --header 'accept: application/json' \
     --header 'content-type: application/json' \
     --data '
{
  "source": "source-dataset-uuid",
  "target": "target-dataset-uuid",
  "sourceLatColumn": "lat",
  "sourceLngColumn": "lng",
  "attributes": ["foo", "bar"],
  "tileSizeKb": 2000
}
'

list_data_connectors

Get all data connectors for the authenticated user or organization.

list_data_connectors(
        self, *, organization: bool = False
    ) -> List[DataConnector]:
uf-data-sdk list-data-connectors
GET https://data-api.foursquare.com/v1/data-connections HTTP/1.1

Keyword Arguments

Parameter | Type | Description
organization | boolean | If True, list data connectors for the organization of the authenticated user.

Response

Returns a list of data connector objects associated with the user, or if specified, the organization.

Example:

[DataConnector(id="...", name="connector", description="desc", type=DataConnectorType.POSTGRES, ...)]

Example

Find examples for both individual and organization requests below.

# List data connectors associated with user account
data_connectors = data_sdk.list_data_connectors()
# List data connectors associated with organization
data_connectors = data_sdk.list_data_connectors(organization=True)
uf-data-sdk list-data-connectors
uf-data-sdk list-data-connectors --organization
GET https://data-api.foursquare.com/v1/data-connections HTTP/1.1
GET https://data-api.foursquare.com/v1/data-connections/for-organization HTTP/1.1

list_datasets

Get all dataset records for the authenticated user.

datasets = data_sdk.list_datasets()
uf-data-sdk list-datasets
GET https://data-api.foursquare.com/v1/datasets HTTP/1.1

Get all dataset records for the organization of the authenticated user.

datasets = data_sdk.list_datasets(organization=True)
uf-data-sdk list-datasets --organization
GET https://data-api.foursquare.com/v1/datasets/for-organization HTTP/1.1

Keyword Parameters

Parameter | Type | Description
organization | boolean | If True, list dataset records for the organization of the authenticated user.

Response

List of dataset records:

{
  "items": [
    {
      "id": "string",
      "name": "string",
      "createdAt": "2021-02-03T23:51:14.527Z",
      "updatedAt": "2021-02-03T23:51:14.527Z",
      "description": "string",
      "isValid": true
    }
  ]
}
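
A response like the one above can be filtered client-side with plain Python. The sketch below uses a hard-coded stand-in payload shaped like the JSON response; the names and IDs are illustrative only:

```python
# Stand-in for a list_datasets-style response payload (illustrative values).
payload = {
    "items": [
        {"id": "a1", "name": "roads", "isValid": True},
        {"id": "b2", "name": "stale-import", "isValid": False},
    ]
}

# Keep only datasets whose data passed validation.
valid_names = [d["name"] for d in payload["items"] if d["isValid"]]
```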

For non-enterprise users, organization=True will cause the request to fail with:

403: Insufficient permission level to perform this action

Example

datasets = data_sdk.list_datasets()
uf-data-sdk list-datasets
curl https://data-api.foursquare.com/v1/datasets -H 'Authorization: Bearer <token>'

Hex Tile Functions

This section contains detailed reference documentation covering the Data SDK's Hex Tile functions.

To view each function with a brief description and an example, visit the Hex Tile functions documentation.


generate_hextile

Renamed from create_hextile

Data can be processed into Hex Tiles using the Studio Data SDK.

# hextile a file
data_sdk.generate_hextile(
    source: Union[Dataset, str, UUID],
    *,
    target: Optional[Union[Dataset, str, UUID]] = None,
    source_hex_column: Optional[str] = None,
    source_time_column: Optional[str] = None,
    time_intervals: Optional[Sequence[Union[TimeInterval, str]]] = None,
    target_res_offset: Optional[int] = None,
    _tile_mode: Optional[Union[TileMode, str]] = None,
    output_columns: Optional[
        Sequence[Union[dict, HexTileOutputColumnConfig]]
    ] = None,
    _positional_indexes: Optional[bool] = None,
) -> Dataset
uf-data-sdk generate-hextile \
    --source <str> \
    --target <str | None> \
    --source-hex-column <str | None> \
    --source-lat-column <str | None> \
    --source-lng-column <str | None> \
    --source-time-column <str | None> \
    --time-interval <List[str]> \
    --output-column <List[str]> \
    --finest-resolution <int | None>
POST https://data-api.foursquare.com/internal/v1/datasets/hextile HTTP/1.1

Python

You can access the Studio Data SDK with Python to process your dataset into Hex Tiles.

generate_hextile function

Use the generate_hextile function to create Hex Tiles from a dataset.

Positional Arguments

Argument | Type | Description
source | string | Required. Dataset record or UUID of the dataset to convert to Hex Tile.

Keyword Arguments

Argument | Type | Description
target | string | Dataset record or UUID of an existing Hex Tile dataset to overwrite.
source_hex_column | string | Name of the hex (H3) column in the source dataset. The hex column must contain hex indexes as strings.
source_lat_column | string | Name of the latitude column in the source dataset.
source_lng_column | string | Name of the longitude column in the source dataset.
finest_resolution | int | Finest resolution for the data hexes within a tile (when creating a tileset from lat/lng columns).
source_time_column | string | Name of the time column in the source dataset, or null for non-temporal data.
time_intervals | string array | Array of time intervals to generate for temporal datasets. Accepted intervals: "YEAR", "MONTH", "DAY", "HOUR", "MINUTE", "SECOND".
output_columns | object array | Object array used to aggregate a new data column during Hex Tile generation.
output_columns.source_column | string | Column name in the source dataset.
output_columns.target_column | string | Column name in the target Hex Tile dataset.
output_columns.agg_method | string | Method used to aggregate the data column when generating coarser tile resolutions. Accepted methods: "sum", "count", "min", "max", "mean", "median", "mode". Defaults to "sum" for numeric columns.
output_columns.dtype | string | Data type to encode the column in the Hex Tile dataset. Example values include "float32", "uint8", "int64".
target_res_offset | int | Optional integer controlling the depth of the tile hierarchy.
tile_mode | string | Experimental. "dense", "sparse", or "auto". Defaults to "auto".
positional_indexes | boolean | Experimental. Enables the positional indexes encoding feature.

HTTP API

You can access the Studio Data API through the HTTP REST API to process your dataset into Hex Tiles.

Headers

Header | Description
Content-Type | Must be application/json. This header is required.

Body

The body of the request should be the parameters encoded as a JSON blob.

Parameter | Type | Description
source | string | Required. Dataset record or UUID of the dataset to convert to Hex Tile.
target | string | Dataset record or UUID of an existing Hex Tile dataset to overwrite.
sourceHexColumn | string | Name of the hex (H3) column in the source dataset. The hex column must contain hex indexes as strings.
sourceLatColumn | string | Name of the latitude column in the source dataset.
sourceLngColumn | string | Name of the longitude column in the source dataset.
finestResolution | int | Finest resolution for the data hexes within a tile (when creating a tileset from lat/lng columns).
sourceTimeColumn | string | Name of the time column in the source dataset, or null for non-temporal data.
timeIntervals | string array | Array of time intervals to generate for temporal datasets. Accepted intervals: "YEAR", "MONTH", "DAY", "HOUR", "MINUTE", "SECOND".
outputColumns | object array | Object array used to aggregate a new data column during Hex Tile generation.
outputColumns.sourceColumn | string | Column name in the source dataset.
outputColumns.targetColumn | string | Column name in the target Hex Tile dataset.
outputColumns.aggMethod | string | Method used to aggregate the data column when generating coarser tile resolutions. Accepted methods: "sum", "count", "min", "max", "mean", "median", "mode". Defaults to "sum" for numeric columns.
outputColumns.dtype | string | Data type to encode the column in the Hex Tile dataset. Example values include "float32", "uint8", "int64".
targetResOffset | int | Optional integer controlling the depth of the tile hierarchy.
tileMode | string | Experimental. "dense", "sparse", or "auto". Defaults to "auto".
positionalIndexes | boolean | Experimental. Enables the positional indexes encoding feature.

Response

Upon completion, you will receive a response containing the metadata of your dataset.

{
  "id": "string",
  "name": "string",
  "createdAt": "2021-02-04T00:17:38.652Z",
  "updatedAt": "2021-02-04T00:17:38.652Z",
  "description": "string",
  "isValid": true
}

Dataset Location

Once processed, your dataset will be stored on your Studio Cloud account. You may download it using the download dataset function in the Data SDK.

Check Hex Tiling Status via API

You can find the status of the Hex Tile dataset in the API.

Retrieve the dataset's metadata, then find one of three codes in the dataStatus field:

Status code | Description
pending | The tiling process is still running.
ready | The tiling process is complete and the Hex Tiles can be used.
error | The tiling process has failed.
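
The status check can be scripted as a polling loop. The sketch below assumes the dataset metadata endpoint GET /v1/datasets/<uuid> returns JSON including the dataStatus field described above; wait_for_hextile and is_terminal are hypothetical helper names, not part of the SDK:

```python
import json
import time
import urllib.request


def is_terminal(status: str) -> bool:
    """True once tiling has finished, successfully ("ready") or not ("error")."""
    return status in ("ready", "error")


def wait_for_hextile(dataset_id: str, token: str, poll_seconds: float = 10.0) -> str:
    """Poll dataset metadata until dataStatus leaves "pending" (sketch)."""
    url = f"https://data-api.foursquare.com/v1/datasets/{dataset_id}"
    while True:
        req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            status = json.load(resp).get("dataStatus")
        if is_terminal(status):
            return status
        time.sleep(poll_seconds)  # still "pending": wait and retry
```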

Examples

# Generate Hex Tiles from a lat/lng dataset with hourly time intervals
data_sdk.generate_hextile(
    source="0b341204-1a76-4c1e-82a1-a856f28c522e",
    source_time_column = "time",
    source_lat_column = "lat",
    source_lng_column = "lon",
    finest_resolution = 9,
    time_intervals= ["HOUR"],
    output_columns= [
      {
        "source_column": "precip_kg/m2",
        "target_column": "precip_sum",
        "agg_method": "sum"
      }
    ]
)
uf-data-sdk generate-hextile \
  --source e32f527e-0917-40aa-955f-8d55105f9673 \
  --source-lat-column Latitude \
  --source-lng-column Longitude \
  --output-column '{"sourceColumn":"Magnitude", "targetColumn":"Avg Magnitude", "aggMethod":"mean"}' \
  --output-column '{"sourceColumn":"Depth", "targetColumn":"Max Depth", "aggMethod":"max"}' \
  --finest-resolution 5
curl -X POST https://data-api.foursquare.com/internal/v1/datasets/hextile \
-H 'Authorization: Bearer <token>' \
-H 'Content-Type: application/json' \
--data-raw '{
    "source": "<source_dataset_id>",
    "sourceHexColumn": "hex",
    "sourceTimeColumn": "datestr",
    "timeIntervals": ["DAY"],
    "targetResOffset": 4,
    "outputColumns": [
        {
        "sourceColumn": "metric",
        "targetColumn": "metric_sum",
        "aggMethod": "sum",
        "dtype": "uint16"
        }
    ]
}'

enrich

Datasets can be enriched with Hex Tiles using the Studio Data SDK.

# enrich a dataframe or existing dataset
data_sdk.enrich(
    dataset: Union[pd.DataFrame, Dataset, UUID, str],
    source_id: UUID,
    source_column: str,
    *,
    h3_column: Optional[str] = None,
    lat_column: Optional[str] = None,
    lng_column: Optional[str] = None,
    time_column: Optional[str] = None,
) -> pd.DataFrame
POST https://data-api.foursquare.com/internal/v1/query HTTP/1.1

HTTP API

Enrichment is provided through the Query API, which can support a range of flexible queries. The following parameters describe a simple enrichment query.

Headers

Header | Description
Content-Type | Must be application/json. This header is required.
Accept | May be */*, application/json, text/csv, or application/vnd.apache.arrow.file. The response dataset will have the corresponding data format (by default, text/csv).

Body

The body of the request should be the parameters, encoded as a JSON blob.

Parameters
Parameter | Type | Required | Description
type | string | Yes | Use enrich to select the enrich process.
sourceId | string | Yes | The UUID of the Hex Tile dataset for enrichment.
sourceColumn | string or string array | Yes | The label of the Hex Tile column for enrichment, or an array of labels for multiple columns.
targetType | string | Yes | Must be either "H3" or "LATLNG".
column | string | Yes for type H3 | Column in target dataset containing H3 addresses.
latColumn | string | Yes for type LATLNG | Column in target dataset containing latitude values.
lngColumn | string | Yes for type LATLNG | Column in target dataset containing longitude values.
timeColumn | string | Yes for temporal datasets | Column in target dataset containing time values in epoch timestamp or ISO-8601 format.
timeInterval | string | No | Time interval to use for enrichment. The target time interval must be available in the Hex Tile dataset. Accepted intervals: YEAR, MONTH, DAY, and HOUR. Defaults to the finest available interval.
input | object array | Yes | Array containing a single object describing the target dataset, in the form {"type": "dataset", "uuid": <uuid>}.
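
The request body branches on targetType, which is easy to get wrong by hand. The following sketch assembles a body from these parameters; enrich_body is a hypothetical helper name, not part of the SDK:

```python
def enrich_body(source_id, source_column, target_uuid, *, h3_column=None,
                lat_column=None, lng_column=None, time_column=None):
    """Assemble a Query API enrich request body (sketch, per the table above)."""
    body = {
        "type": "enrich",
        "sourceId": source_id,
        "sourceColumn": source_column,
        "input": [{"type": "dataset", "uuid": target_uuid}],
    }
    if h3_column is not None:
        # Target dataset already carries H3 addresses.
        body["targetType"] = "H3"
        body["column"] = h3_column
    else:
        # Target dataset carries lat/lng columns instead.
        body["targetType"] = "LATLNG"
        body["latColumn"] = lat_column
        body["lngColumn"] = lng_column
    if time_column is not None:
        body["timeColumn"] = time_column
    return body
```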

Python

enrich function

Positional Arguments

Argument | Type | Required | Description
dataset | string | Yes | Pandas DataFrame, Dataset record, or UUID of the dataset to enrich.
source_id | string | Yes | UUID of the Hex Tile dataset to use for enrichment.
source_column | string | Yes | Label of the Hex Tile column to use for enrichment.

Keyword Arguments

Argument | Type | Required | Description
h3_column | string | Yes for H3 data | Column in target dataset with H3 addresses.
lat_column | string | Yes for lat/lng data | Column in target dataset with latitude values.
lng_column | string | Yes for lat/lng data | Column in target dataset with longitude values.
time_column | string | Yes for temporal data | Column in target dataset with time values in epoch timestamp or ISO-8601 format.

Response

Upon completion, you will receive the enriched dataset in CSV, JSON, or Arrow format depending on the Accept header.

Examples

data_sdk.enrich(
    dataset="my-dataset-uuid",
    source_id="my-hex-tile-uuid",
    source_column="some_value",
    lat_column="lat",
    lng_column="lng",
    time_column="date",
)
curl -X POST https://data-api.foursquare.com/internal/v1/query \
-H 'Authorization: Bearer <token>' \
-H 'Content-Type: application/json' \
-H 'Accept: text/csv' \
--data-raw '{
    "type": "enrich",
    "input": [
        {
            "type": "dataset",
            "uuid": "my-target-uuid"
        }
    ],
    "sourceId": "my-hex-tile-uuid",
    "sourceColumn": "some_value",
    "timeColumn": "date",
    "targetType": "LATLNG",
    "latColumn": "lat",
    "lngColumn": "lng"
}'

extract

You may specify an area of Hex Tiles (represented by a GeoJSON geometry) to extract. Returns a pandas DataFrame of H3-indexed data.

# extract hex tiles
data_sdk.extract(
    self,
    source_id: Union[str, UUID],
    geojson: Dict,
    *,
    source_column: Optional[Union[str, List[str]]] = None,
    res: Optional[int] = None,
    h3_column: Optional[str] = None,
    time_column: Optional[str] = None,
    time_interval: Optional[Union[Dict, TimeInterval]] = None,
) -> pd.DataFrame:
POST https://data-api.foursquare.com/internal/v1/query HTTP/1.1
    Note: Extracting Hex Tiles makes use of Studio's Query API.
    Please contact us if you wish to use the Query API to extract Hex Tiles.

Python

Use the extract function to extract a region of Hex Tiles from a Hex Tile dataset.

Positional Arguments

Argument | Type | Description
source_id | string | Required. UUID of the Hex Tile dataset to extract from.
geojson | dict | Required. A GeoJSON geometry of the area to extract.

Keyword Arguments

Argument | Type | Description
source_column | string | Column in the Hex Tile dataset.
res | int | H3 resolution of the data to extract.
h3_column | string | Name of the output column containing H3 indexes. Default: h3_<res>
time_column | string | Name of the output column containing time indexes. Default: date
time_interval | string | Time interval to extract. Accepted intervals: "YEAR", "MONTH", "DAY", "HOUR", "MINUTE", "SECOND".

Response

Upon completion, you will receive a response containing a Pandas dataframe with the extracted dataset.

# Extract Hex Tiles within a specified GeoJSON geometry
extracted_dataset = data_sdk.extract(
    source_id="<UUID>",
    geojson={
        "type": "Feature",
        "geometry": {
            "type": "Polygon",
            "coordinates": [
                [
                    [0, 0],
                    [0, 10],
                    [10, 0],
                    [0, 0]
                ]
            ]
        }
    },
    source_column="hextile_column",
    res=8,
    h3_column="h3_8",
    time_column="year",
    time_interval="YEAR"
)
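
The returned DataFrame can then be post-processed with standard pandas. A minimal sketch, using a synthetic frame as a stand-in for real extract() output (actual column names depend on the h3_column, time_column, and source_column arguments):

```python
import pandas as pd

# Synthetic stand-in for extract() output: one row per hex cell.
df = pd.DataFrame({
    "h3_8": ["8828308281fffff", "8828308283fffff", "8828308285fffff"],
    "year": [2020, 2020, 2021],
    "hextile_column": [1.5, 2.5, 4.0],
})

# Aggregate the extracted metric per time step.
per_year = df.groupby("year")["hextile_column"].sum()
```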