id,page,ref,title,content,breadcrumbs,references
authentication:authentication,authentication,authentication,Authentication and permissions,"Datasette does not require authentication by default. Any visitor to a Datasette instance can explore the full data and execute read-only SQL queries.
Datasette's plugin system can be used to add many different styles of authentication, such as user accounts, single sign-on or API keys.",[],[]
authentication:authentication-actor,authentication,authentication-actor,Actors,"Through plugins, Datasette can support both authenticated users (with cookies) and authenticated API agents (via authentication tokens). The word ""actor"" is used to cover both of these cases.
Every request to Datasette has an associated actor value, available in the code as request.actor . This can be None for unauthenticated requests, or a JSON compatible Python dictionary for authenticated users or API agents.
The actor dictionary can be any shape - the design of that data structure is left up to the plugins. A useful convention is to include an ""id"" string, as demonstrated by the ""root"" actor below.
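For example, plugin code that receives a request object (as many plugin hooks do) might inspect the actor like this - a minimal sketch, with the branching logic purely illustrative:
# request.actor is None for unauthenticated requests
if request.actor is None:
    actor_id = None
else:
    # By convention many actors include an ""id"" string
    actor_id = request.actor.get(""id"")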
Plugins can use the actor_from_request(datasette, request) hook to implement custom logic for authenticating an actor based on the incoming HTTP request.","[""Authentication and permissions""]",[]
authentication:authentication-actor-matches-allow,authentication,authentication-actor-matches-allow,actor_matches_allow(),"Plugins that wish to implement this same ""allow"" block permissions scheme can take advantage of the datasette.utils.actor_matches_allow(actor, allow) function:
from datasette.utils import actor_matches_allow
actor_matches_allow({""id"": ""root""}, {""id"": ""*""})
# returns True
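The function returns False when the actor does not match the allow block - for example, with a hypothetical ""cleopaws"" actor:
actor_matches_allow({""id"": ""cleopaws""}, {""id"": [""root""]})
# returns False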
The currently authenticated actor is made available to plugins as request.actor .","[""Authentication and permissions""]",[]
authentication:authentication-ds-actor,authentication,authentication-ds-actor,The ds_actor cookie,"Datasette includes a default authentication plugin which looks for a signed ds_actor cookie containing a JSON actor dictionary. This is how the root actor mechanism works.
Authentication plugins can set signed ds_actor cookies themselves like so:
from datasette.utils.asgi import Response
response = Response.redirect(""/"")
response.set_cookie(
""ds_actor"",
datasette.sign({""a"": {""id"": ""cleopaws""}}, ""actor""),
)
Note that you need to pass ""actor"" as the namespace to .sign(value, namespace=""default"") .
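A plugin can read the cookie back using the matching .unsign() method. Here is a rough sketch - error handling for a missing or tampered-with cookie is omitted, and the variable names are illustrative:
ds_actor_cookie = request.cookies.get(""ds_actor"")
if ds_actor_cookie:
    actor = datasette.unsign(ds_actor_cookie, ""actor"")[""a""]
else:
    actor = None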
The shape of data encoded in the cookie is as follows:
{
""a"": {... actor ...}
}","[""Authentication and permissions""]",[]
authentication:authentication-ds-actor-expiry,authentication,authentication-ds-actor-expiry,Including an expiry time,"ds_actor cookies can optionally include a signed expiry timestamp, after which the cookies will no longer be valid. Authentication plugins may choose to use this mechanism to limit the lifetime of the cookie. For example, if a plugin implements single sign-on against another source it may decide to set short-lived cookies so that if the user is removed from the SSO system their existing Datasette cookies will stop working shortly afterwards.
To include an expiry, add an ""e"" key to the cookie value containing a base62-encoded integer representing the timestamp when the cookie should expire. For example, here's how to set a cookie that expires after 24 hours:
import time
from datasette.utils import baseconv
from datasette.utils.asgi import Response
expires_at = int(time.time()) + (24 * 60 * 60)
response = Response.redirect(""/"")
response.set_cookie(
""ds_actor"",
datasette.sign(
{
""a"": {""id"": ""cleopaws""},
""e"": baseconv.base62.encode(expires_at),
},
""actor"",
),
)
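Code that later validates such a cookie can decode the ""e"" value and compare it against the current time. A rough sketch, reusing the time and baseconv imports from above (the helper function name is hypothetical):
def cookie_has_expired(decoded_cookie):
    # decoded_cookie is the dictionary recovered using datasette.unsign()
    if ""e"" not in decoded_cookie:
        return False
    expires_at = baseconv.base62.decode(decoded_cookie[""e""])
    return expires_at < time.time()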
The resulting cookie will encode data that looks something like this:
{
""a"": {
""id"": ""cleopaws""
},
""e"": ""1jjSji""
}","[""Authentication and permissions"", ""The ds_actor cookie""]",[]
authentication:authentication-permissions-database,authentication,authentication-permissions-database,Controlling access to specific databases,"To limit access to a specific private.db database to just authenticated users, use the ""allow"" block like this:
{
""databases"": {
""private"": {
""allow"": {
""id"": ""*""
}
}
}
}","[""Authentication and permissions"", ""Configuring permissions in metadata.json""]",[]
authentication:authentication-permissions-metadata,authentication,authentication-permissions-metadata,Configuring permissions in metadata.json,"You can limit who is allowed to view different parts of your Datasette instance using ""allow"" keys in your Metadata configuration.
You can control the following:
Access to the entire Datasette instance
Access to specific databases
Access to specific tables and views
Access to specific Canned queries
If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries.","[""Authentication and permissions""]",[]
authentication:authentication-permissions-query,authentication,authentication-permissions-query,Controlling access to specific canned queries,"Canned queries allow you to configure named SQL queries in your metadata.json that can be executed by users. These queries can be set up to both read and write to the database, so controlling who can execute them can be important.
To limit access to the add_name canned query in your dogs.db database to just the root user :
{
""databases"": {
""dogs"": {
""queries"": {
""add_name"": {
""sql"": ""INSERT INTO names (name) VALUES (:name)"",
""write"": true,
""allow"": {
""id"": [""root""]
}
}
}
}
}
}","[""Authentication and permissions"", ""Configuring permissions in metadata.json""]",[]
authentication:id1,authentication,id1,Built-in permissions,"This section lists all of the permission checks that are carried out by Datasette core, along with the resource if it was passed.","[""Authentication and permissions""]",[]
authentication:logoutview,authentication,logoutview,The /-/logout page,The page at /-/logout provides the ability to log out of a ds_actor cookie authentication session.,"[""Authentication and permissions"", ""The ds_actor cookie""]",[]
authentication:permissions-debug-menu,authentication,permissions-debug-menu,debug-menu,"Controls if the various debug pages are displayed in the navigation menu.
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
authentication:permissions-permissions-debug,authentication,permissions-permissions-debug,permissions-debug,"Actor is allowed to view the /-/permissions debug page.
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
authentication:permissions-plugins,authentication,permissions-plugins,Checking permissions in plugins,"Datasette plugins can check if an actor has permission to perform an action using the datasette.permission_allowed(...) method.
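For example, a rough sketch of such a check inside a plugin - the method is awaitable so it must be called from async code, and the action and resource shown here are illustrative:
allowed = await datasette.permission_allowed(
    request.actor, ""view-database"", resource=""fixtures"", default=False
)
if not allowed:
    # Deny access - for example by returning a 403 response
    ...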
Datasette core performs a number of permission checks, documented below . Plugins can implement the permission_allowed(datasette, actor, action, resource) plugin hook to participate in decisions about whether an actor should be able to perform a specified action.","[""Authentication and permissions""]",[]
authentication:permissionsdebugview,authentication,permissionsdebugview,The permissions debug tool,"The debug tool at /-/permissions is only available to the authenticated root user (or any actor granted the permissions-debug action according to a plugin).
It shows the thirty most recent permission checks that have been carried out by the Datasette instance.
This is designed to help administrators and plugin authors understand exactly how permission checks are being carried out, in order to effectively configure Datasette's permission system.","[""Authentication and permissions""]",[]
changelog:cookie-methods,changelog,cookie-methods,Cookie methods,"Plugins can now use the new response.set_cookie() method to set cookies.
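A minimal sketch of setting a cookie from plugin code - the cookie name and value here are made up:
from datasette.utils.asgi import Response
response = Response.redirect(""/"")
response.set_cookie(""my-cookie"", ""value"")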
A new request.cookies method on the Request object can be used to read incoming cookies.","[""Changelog"", ""0.44 (2020-06-11)""]",[]
changelog:foreign-key-expansions,changelog,foreign-key-expansions,Foreign key expansions,"When Datasette detects a foreign key reference it attempts to resolve a label
for that reference (automatically or using the Specifying the label column for a table metadata
option) so it can display a link to the associated row.
This expansion is now also available for JSON and CSV representations of the
table, using the new _labels=on query string option. See
Expanding foreign key references for more details.","[""Changelog"", ""0.23 (2018-06-18)""]",[]
changelog:id1,changelog,id1,Changelog,,[],[]
changelog:id155,changelog,id155,0.17 (2018-04-13),Release 0.17 to fix issues with PyPI,"[""Changelog""]",[]
changelog:id209,changelog,id209,0.11 (2017-11-14),"Added datasette publish now --force option.
This calls now with --force - useful as it means you get a fresh copy of datasette even if Now has already cached that docker layer.
Enable --cors by default when running in a container.","[""Changelog""]",[]
changelog:id212,changelog,id212,0.9 (2017-11-13),"Added --sql_time_limit_ms and --extra-options .
The serve command now accepts --sql_time_limit_ms for customizing the SQL time
limit.
The publish and package commands now accept --extra-options which can be used
to specify additional options to be passed to the datasette serve command when
it executes inside the resulting Docker containers.","[""Changelog""]",[]
changelog:id24,changelog,id24,0.60 (2022-01-13),,"[""Changelog""]",[]
changelog:id46,changelog,id46,0.51.1 (2020-10-31),Improvements to the new Binary data documentation page.,"[""Changelog""]",[]
changelog:id47,changelog,id47,0.51 (2020-10-31),"A new visual design, plugin hooks for adding navigation options, better handling of binary data, URL building utility methods and better support for running Datasette behind a proxy.","[""Changelog""]",[]
changelog:id85,changelog,id85,0.29 (2019-07-07),"ASGI, new plugin hooks, facet by date and much, much more...","[""Changelog""]",[]
changelog:id88,changelog,id88,0.27.1 (2019-05-09),"Tiny bugfix release: don't install tests/ in the wrong place. Thanks, Veit Heller.","[""Changelog""]",[]
changelog:id92,changelog,id92,0.25.2 (2018-12-16),"datasette publish heroku now uses the python-3.6.7 runtime
Added documentation on how to build the documentation
Added documentation covering our release process
Upgraded to pytest 4.0.2","[""Changelog""]",[]
changelog:signed-values-and-secrets,changelog,signed-values-and-secrets,Signed values and secrets,"Both flash messages and user authentication needed a way to sign values and set signed cookies. Two new methods are now available for plugins to take advantage of this mechanism: .sign(value, namespace=""default"") and .unsign(value, namespace=""default"") .
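A quick sketch of the round trip, using an arbitrary namespace string:
signed = datasette.sign({""id"": ""root""}, namespace=""demo"")
unsigned = datasette.unsign(signed, namespace=""demo"")
# unsigned == {""id"": ""root""}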
Datasette will generate a secret automatically when it starts up, but to avoid resetting the secret (and hence invalidating any cookies) every time the server restarts you should set your own secret. You can pass a secret to Datasette using the new --secret option or with a DATASETTE_SECRET environment variable. See Configuring the secret for more details.
You can also set a secret when you deploy Datasette using datasette publish or datasette package - see Using secrets with datasette publish .
Plugins can now sign values and verify their signatures using the datasette.sign() and datasette.unsign() methods.","[""Changelog"", ""0.44 (2020-06-11)""]",[]
changelog:v0-29-medium-changes,changelog,v0-29-medium-changes,Easier custom templates for table rows,"If you want to customize the display of individual table rows, you can do so using a _table.html template include that looks something like this:
{% for row in display_rows %}
{{ row[""title""] }}
{{ row[""description""] }}
Category: {{ row.display(""category_id"") }}
{% endfor %}
This is a backwards incompatible change . If you previously had a custom template called _rows_and_columns.html you need to rename it to _table.html .
See Custom templates for full details.","[""Changelog"", ""0.29 (2019-07-07)""]",[]
cli-reference:cli-datasette-get,cli-reference,cli-datasette-get,datasette --get,"The --get option to datasette serve (or just datasette ) specifies the path to a page within Datasette and causes Datasette to output the content from that path without starting the web server.
This means that all of Datasette's functionality can be accessed directly from the command-line.
For example:
$ datasette --get '/-/versions.json' | jq .
{
""python"": {
""version"": ""3.8.5"",
""full"": ""3.8.5 (default, Jul 21 2020, 10:48:26) \n[Clang 11.0.3 (clang-1103.0.32.62)]""
},
""datasette"": {
""version"": ""0.46+15.g222a84a.dirty""
},
""asgi"": ""3.0"",
""uvicorn"": ""0.11.8"",
""sqlite"": {
""version"": ""3.32.3"",
""fts_versions"": [
""FTS5"",
""FTS4"",
""FTS3""
],
""extensions"": {
""json1"": null
},
""compile_options"": [
""COMPILER=clang-11.0.3"",
""ENABLE_COLUMN_METADATA"",
""ENABLE_FTS3"",
""ENABLE_FTS3_PARENTHESIS"",
""ENABLE_FTS4"",
""ENABLE_FTS5"",
""ENABLE_GEOPOLY"",
""ENABLE_JSON1"",
""ENABLE_PREUPDATE_HOOK"",
""ENABLE_RTREE"",
""ENABLE_SESSION"",
""MAX_VARIABLE_NUMBER=250000"",
""THREADSAFE=1""
]
}
}
The exit code will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error.
This lets you use datasette --get / to run tests against a Datasette application in a continuous integration environment such as GitHub Actions.","[""CLI reference"", ""datasette serve""]",[]
cli-reference:cli-help-help,cli-reference,cli-help-help,datasette --help,"Running datasette --help shows a list of all of the available commands.
[[[cog
help([""--help""])
]]]
Usage: datasette [OPTIONS] COMMAND [ARGS]...
Datasette is an open source multi-tool for exploring and publishing data
About Datasette: https://datasette.io/
Full documentation: https://docs.datasette.io/
Options:
--version Show the version and exit.
--help Show this message and exit.
Commands:
serve* Serve up specified SQLite database files with a web UI
inspect Generate JSON summary of provided database files
install Install plugins and packages from PyPI into the same...
package Package SQLite files into a Datasette Docker container
plugins List currently installed plugins
publish Publish specified SQLite database files to the internet along...
uninstall Uninstall plugins and Python packages from the Datasette...
[[[end]]]
Additional commands added by plugins that use the register_commands(cli) hook will be listed here as well.","[""CLI reference""]",[]
cli-reference:cli-help-inspect-help,cli-reference,cli-help-inspect-help,datasette inspect,"Outputs JSON representing introspected data about one or more SQLite database files.
If you are opening an immutable database, you can pass this file to the --inspect-file option to improve Datasette's performance by allowing it to skip running row counts against the database when it first starts running:
datasette inspect mydatabase.db > inspect-data.json
datasette serve -i mydatabase.db --inspect-file inspect-data.json
This performance optimization is used automatically by some of the datasette publish commands. You are unlikely to need to apply this optimization manually.
[[[cog
help([""inspect"", ""--help""])
]]]
Usage: datasette inspect [OPTIONS] [FILES]...
Generate JSON summary of provided database files
This can then be passed to ""datasette --inspect-file"" to speed up count
operations against immutable database files.
Options:
--inspect-file TEXT
--load-extension PATH:ENTRYPOINT?
Path to a SQLite extension to load, and
optional entrypoint
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-package-help,cli-reference,cli-help-package-help,datasette package,"Package SQLite files into a Datasette Docker container, see datasette package .
[[[cog
help([""package"", ""--help""])
]]]
Usage: datasette package [OPTIONS] FILES...
Package SQLite files into a Datasette Docker container
Options:
-t, --tag TEXT Name for the resulting Docker container, can
optionally use name:tag format
-m, --metadata FILENAME Path to JSON/YAML file containing metadata to
publish
--extra-options TEXT Extra options to pass to datasette serve
--branch TEXT Install datasette from a GitHub branch e.g. main
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at /MOUNT/...
--install TEXT Additional packages (e.g. plugins) to install
--spatialite Enable SpatialLite extension
--version-note TEXT Additional note to show on /-/versions
--secret TEXT Secret used for signing secure values, such as
signed cookies
-p, --port INTEGER RANGE Port to run the server on, defaults to 8001
[1<=x<=65535]
--title TEXT Title for metadata
--license TEXT License label for metadata
--license_url TEXT License URL for metadata
--source TEXT Source label for metadata
--source_url TEXT Source URL for metadata
--about TEXT About label for metadata
--about_url TEXT About URL for metadata
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-plugins-help,cli-reference,cli-help-plugins-help,datasette plugins,"Output JSON showing all currently installed plugins, their versions, whether they include static files or templates and which Plugin hooks they use.
[[[cog
help([""plugins"", ""--help""])
]]]
Usage: datasette plugins [OPTIONS]
List currently installed plugins
Options:
--all Include built-in default plugins
--plugins-dir DIRECTORY Path to directory containing custom plugins
--help Show this message and exit.
[[[end]]]
Example output:
[
{
""name"": ""datasette-geojson"",
""static"": false,
""templates"": false,
""version"": ""0.3.1"",
""hooks"": [
""register_output_renderer""
]
},
{
""name"": ""datasette-geojson-map"",
""static"": true,
""templates"": false,
""version"": ""0.4.0"",
""hooks"": [
""extra_body_script"",
""extra_css_urls"",
""extra_js_urls""
]
},
{
""name"": ""datasette-leaflet"",
""static"": true,
""templates"": false,
""version"": ""0.2.2"",
""hooks"": [
""extra_body_script"",
""extra_template_vars""
]
}
]","[""CLI reference""]",[]
cli-reference:cli-help-publish-cloudrun-help,cli-reference,cli-help-publish-cloudrun-help,datasette publish cloudrun,"See Publishing to Google Cloud Run .
[[[cog
help([""publish"", ""cloudrun"", ""--help""])
]]]
Usage: datasette publish cloudrun [OPTIONS] [FILES]...
Publish databases to Datasette running on Cloud Run
Options:
-m, --metadata FILENAME Path to JSON/YAML file containing metadata to
publish
--extra-options TEXT Extra options to pass to datasette serve
--branch TEXT Install datasette from a GitHub branch e.g.
main
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--install TEXT Additional packages (e.g. plugins) to install
--plugin-secret ...
Secrets to pass to plugins, e.g. --plugin-
secret datasette-auth-github client_id xxx
--version-note TEXT Additional note to show on /-/versions
--secret TEXT Secret used for signing secure values, such as
signed cookies
--title TEXT Title for metadata
--license TEXT License label for metadata
--license_url TEXT License URL for metadata
--source TEXT Source label for metadata
--source_url TEXT Source URL for metadata
--about TEXT About label for metadata
--about_url TEXT About URL for metadata
-n, --name TEXT Application name to use when building
--service TEXT Cloud Run service to deploy (or over-write)
--spatialite Enable SpatialLite extension
--show-files Output the generated Dockerfile and
metadata.json
--memory TEXT Memory to allocate in Cloud Run, e.g. 1Gi
--cpu [1|2|4] Number of vCPUs to allocate in Cloud Run
--timeout INTEGER Build timeout in seconds
--apt-get-install TEXT Additional packages to apt-get install
--max-instances INTEGER Maximum Cloud Run instances
--min-instances INTEGER Minimum Cloud Run instances
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-publish-help,cli-reference,cli-help-publish-help,datasette publish,"Shows a list of available deployment targets for publishing data with Datasette.
Additional deployment targets can be added by plugins that use the publish_subcommand(publish) hook.
[[[cog
help([""publish"", ""--help""])
]]]
Usage: datasette publish [OPTIONS] COMMAND [ARGS]...
Publish specified SQLite database files to the internet along with a
Datasette-powered interface and API
Options:
--help Show this message and exit.
Commands:
cloudrun Publish databases to Datasette running on Cloud Run
heroku Publish databases to Datasette running on Heroku
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-publish-heroku-help,cli-reference,cli-help-publish-heroku-help,datasette publish heroku,"See Publishing to Heroku .
[[[cog
help([""publish"", ""heroku"", ""--help""])
]]]
Usage: datasette publish heroku [OPTIONS] [FILES]...
Publish databases to Datasette running on Heroku
Options:
-m, --metadata FILENAME Path to JSON/YAML file containing metadata to
publish
--extra-options TEXT Extra options to pass to datasette serve
--branch TEXT Install datasette from a GitHub branch e.g.
main
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--install TEXT Additional packages (e.g. plugins) to install
--plugin-secret ...
Secrets to pass to plugins, e.g. --plugin-
secret datasette-auth-github client_id xxx
--version-note TEXT Additional note to show on /-/versions
--secret TEXT Secret used for signing secure values, such as
signed cookies
--title TEXT Title for metadata
--license TEXT License label for metadata
--license_url TEXT License URL for metadata
--source TEXT Source label for metadata
--source_url TEXT Source URL for metadata
--about TEXT About label for metadata
--about_url TEXT About URL for metadata
-n, --name TEXT Application name to use when deploying
--tar TEXT --tar option to pass to Heroku, e.g.
--tar=/usr/local/bin/gtar
--generate-dir DIRECTORY Output generated application files and stop
without deploying
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-serve-help,cli-reference,cli-help-serve-help,datasette serve,"This command starts the Datasette web application running on your machine:
datasette serve mydatabase.db
Or since this is the default command you can run this instead:
datasette mydatabase.db
Once started you can access it at http://localhost:8001
[[[cog
help([""serve"", ""--help""])
]]]
Usage: datasette serve [OPTIONS] [FILES]...
Serve up specified SQLite database files with a web UI
Options:
-i, --immutable PATH Database files to open in immutable mode
-h, --host TEXT Host for server. Defaults to 127.0.0.1 which
means only connections from the local machine
will be allowed. Use 0.0.0.0 to listen to all
IPs and allow access from other machines.
-p, --port INTEGER RANGE Port for server, defaults to 8001. Use -p 0 to
automatically assign an available port.
[0<=x<=65535]
--uds TEXT Bind to a Unix domain socket
--reload Automatically reload if code or metadata
change detected - useful for development
--cors Enable CORS by serving Access-Control-Allow-
Origin: *
--load-extension PATH:ENTRYPOINT?
Path to a SQLite extension to load, and
optional entrypoint
--inspect-file TEXT Path to JSON file created using ""datasette
inspect""
-m, --metadata FILENAME Path to JSON/YAML file containing
license/source metadata
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--memory Make /_memory database available
--config CONFIG Deprecated: set config option using
configname:value. Use --setting instead.
--setting SETTING... Setting, see
docs.datasette.io/en/stable/settings.html
--secret TEXT Secret used for signing secure values, such as
signed cookies
--root Output URL that sets a cookie authenticating
the root user
--get TEXT Run an HTTP GET request against this path,
print results and exit
--version-note TEXT Additional note to show on /-/versions
--help-settings Show available settings
--pdb Launch debugger on any errors
-o, --open Open Datasette in your web browser
--create Create database files if they do not exist
--crossdb Enable cross-database joins using the /_memory
database
--nolock Ignore locking, open locked files in read-only
mode
--ssl-keyfile TEXT SSL key file
--ssl-certfile TEXT SSL certificate file
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-serve-help-settings,cli-reference,cli-help-serve-help-settings,datasette serve --help-settings,"This command outputs all of the available Datasette settings .
These can be passed to datasette serve using datasette serve --setting name value .
[[[cog
help([""--help-settings""])
]]]
Settings:
default_page_size Default page size for the table view
(default=100)
max_returned_rows Maximum rows that can be returned from a table or
custom query (default=1000)
num_sql_threads Number of threads in the thread pool for
executing SQLite queries (default=3)
sql_time_limit_ms Time limit for a SQL query in milliseconds
(default=1000)
default_facet_size Number of values to return for requested facets
(default=30)
facet_time_limit_ms Time limit for calculating a requested facet
(default=200)
facet_suggest_time_limit_ms Time limit for calculating a suggested facet
(default=50)
allow_facet Allow users to specify columns to facet using
?_facet= parameter (default=True)
default_allow_sql Allow anyone to run arbitrary SQL queries
(default=True)
allow_download Allow users to download the original SQLite
database files (default=True)
suggest_facets Calculate and display suggested facets
(default=True)
default_cache_ttl Default HTTP cache TTL (used in Cache-Control:
max-age= header) (default=5)
cache_size_kb SQLite cache size in KB (0 == use SQLite default)
(default=0)
allow_csv_stream Allow .csv?_stream=1 to download all rows
(ignoring max_returned_rows) (default=True)
max_csv_mb Maximum size allowed for CSV export in MB - set 0
to disable this limit (default=100)
truncate_cells_html Truncate cells longer than this in HTML table
view - set 0 to disable (default=2048)
force_https_urls Force URLs in API output to always use https://
protocol (default=False)
template_debug Allow display of template debug information with
?_context=1 (default=False)
trace_debug Allow display of SQL trace debug information with
?_trace=1 (default=False)
base_url Datasette URLs should use this base path
(default=/)
[[[end]]]","[""CLI reference"", ""datasette serve""]",[]
cli-reference:cli-help-uninstall-help,cli-reference,cli-help-uninstall-help,datasette uninstall,"Uninstall one or more plugins.
[[[cog
help([""uninstall"", ""--help""])
]]]
Usage: datasette uninstall [OPTIONS] PACKAGES...
Uninstall plugins and Python packages from the Datasette environment
Options:
-y, --yes Don't ask for confirmation
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:id1,cli-reference,id1,CLI reference,"The datasette CLI tool provides a number of commands.
Running datasette without specifying a command runs the default command, datasette serve . See datasette serve for the full list of options for that command.
[[[cog
from datasette import cli
from click.testing import CliRunner
import textwrap
def help(args):
title = ""datasette "" + "" "".join(args)
cog.out(""\n::\n\n"")
result = CliRunner().invoke(cli.cli, args)
output = result.output.replace(""Usage: cli "", ""Usage: datasette "")
cog.out(textwrap.indent(output, ' '))
cog.out(""\n\n"")
]]]
[[[end]]]",[],[]
contributing:contributing-debugging,contributing,contributing-debugging,Debugging,"Any errors that occur while Datasette is running will display a stack trace on the console.
You can tell Datasette to open an interactive pdb debugger session if an error occurs using the --pdb option:
datasette --pdb fixtures.db","[""Contributing""]",[]
contributing:contributing-formatting-black,contributing,contributing-formatting-black,Running Black,"Black will be installed when you run pip install -e '.[test]' . To test that your code complies with Black, run the following in your root datasette repository checkout:
$ black . --check
All done! ✨ 🍰 ✨
95 files would be left unchanged.
If any of your code does not conform to Black you can run this to automatically fix those problems:
$ black .
reformatted ../datasette/setup.py
All done! ✨ 🍰 ✨
1 file reformatted, 94 files left unchanged.","[""Contributing"", ""Code formatting""]",[]
contributing:contributing-using-fixtures,contributing,contributing-using-fixtures,Using fixtures,"To run Datasette itself, type datasette .
You're going to need at least one SQLite database. A quick way to get started is to use the fixtures database that Datasette uses for its own tests.
You can create a copy of that database by running this command:
python tests/fixtures.py fixtures.db
Now you can run Datasette against the new fixtures database like so:
datasette fixtures.db
This will start a server at http://127.0.0.1:8001/ .
Any changes you make in the datasette/templates or datasette/static folder will be picked up immediately (though you may need to do a force-refresh in your browser to see changes to CSS or JavaScript).
If you want to change Datasette's Python code you can use the --reload option to cause Datasette to automatically reload any time the underlying code changes:
datasette --reload fixtures.db
You can also use the fixtures.py script to recreate the testing version of metadata.json used by the unit tests. To do that:
python tests/fixtures.py fixtures.db fixtures-metadata.json
Or to output the plugins used by the tests, run this:
python tests/fixtures.py fixtures.db fixtures-metadata.json fixtures-plugins
Test tables written to fixtures.db
- metadata written to fixtures-metadata.json
Wrote plugin: fixtures-plugins/register_output_renderer.py
Wrote plugin: fixtures-plugins/view_name.py
Wrote plugin: fixtures-plugins/my_plugin.py
Wrote plugin: fixtures-plugins/messages_output_renderer.py
Wrote plugin: fixtures-plugins/my_plugin_2.py
Then run Datasette like this:
datasette fixtures.db -m fixtures-metadata.json --plugins-dir=fixtures-plugins/","[""Contributing""]",[]
contributing:general-guidelines,contributing,general-guidelines,General guidelines,"main should always be releasable . Incomplete features should live in branches. This ensures that any small bug fixes can be quickly released.
The ideal commit should bundle together the implementation, unit tests and associated documentation updates. The commit message should link to an associated issue.
New plugin hooks should only be shipped if accompanied by a separate release of a non-demo plugin that uses them.","[""Contributing""]",[]
contributing:id1,contributing,id1,Contributing,"Datasette is an open source project. We welcome contributions!
This document describes how to contribute to Datasette core. You can also contribute to the wider Datasette ecosystem by creating new Plugins .",[],[]
csv_export:csv-export-url-parameters,csv_export,csv-export-url-parameters,URL parameters,"The following options can be used to customize the CSVs returned by Datasette.
?_header=off
This removes the first row of the CSV file specifying the headings - only the row data will be returned.
?_stream=on
Stream all matching records, not just the first page of results. See below.
?_dl=on
Causes Datasette to return a content-disposition: attachment; filename=""filename.csv"" header.","[""CSV export""]",[]
csv_export:streaming-all-records,csv_export,streaming-all-records,Streaming all records,"The stream all rows option is designed to be as efficient as possible -
under the hood it takes advantage of Python 3 asyncio capabilities and
Datasette's efficient pagination to stream back the full
CSV file.
Since databases can get pretty large, by default this option is capped at 100MB -
if a table returns more than 100MB of data the last line of the CSV will be a
truncation error message.
You can increase or remove this limit using the max_csv_mb config
setting. You can also disable the CSV export feature entirely using
allow_csv_stream .","[""CSV export""]",[]
custom_templates:css-classes-on-the-body,custom_templates,css-classes-on-the-body,CSS classes on the <body>,"Every default template includes CSS classes in the <body> designed to support
custom styling.
The index template (the top level page at / ) gets this:
The database template ( /dbname ) gets this:
The custom SQL template ( /dbname?sql=... ) gets this:
A canned query template ( /dbname/queryname ) gets this:
The table template ( /dbname/tablename ) gets:
The row template ( /dbname/tablename/rowid ) gets:
The db-x and table-x classes use the database or table names themselves if
they are valid CSS identifiers. If they aren't, we strip any invalid
characters out and append a 6 character md5 digest of the original name, in
order to ensure that multiple tables which resolve to the same stripped
character version still have different CSS classes.
Some examples:
""simple"" => ""simple""
""MixedCase"" => ""MixedCase""
""-no-leading-hyphens"" => ""no-leading-hyphens-65bea6""
""_no-leading-underscores"" => ""no-leading-underscores-b921bc""
""no spaces"" => ""no-spaces-7088d7""
""-"" => ""336d5e""
""no $ characters"" => ""no--characters-59e024""
<td> and <th> elements also get custom CSS classes reflecting the
database column they are representing.","[""Custom pages and templates"", ""Custom CSS and JavaScript""]",[]
custom_templates:custom-pages-404,custom_templates,custom-pages-404,Returning 404s,"To indicate that content could not be found and display the default 404 page you can use the raise_404(message) function:
{% if not rows %}
{{ raise_404(""Content not found"") }}
{% endif %}
If you call raise_404() the other content in your template will be ignored.","[""Custom pages and templates"", ""Custom pages""]",[]
custom_templates:custom-pages-headers,custom_templates,custom-pages-headers,Custom headers and status codes,"Custom pages default to being served with a content-type of text/html; charset=utf-8 and a 200 status code. You can change these by calling a custom function from within your template.
For example, to serve a custom page with a 418 I'm a teapot HTTP status code, create a file in pages/teapot.html containing the following:
{{ custom_status(418) }}
Teapot
I'm a teapot
To serve a custom HTTP header, add a custom_header(name, value) function call. For example:
{{ custom_status(418) }}
{{ custom_header(""x-teapot"", ""I am"") }}
Teapot
I'm a teapot
You can verify this is working using curl like this:
$ curl -I 'http://127.0.0.1:8001/teapot'
HTTP/1.1 418
date: Sun, 26 Apr 2020 18:38:30 GMT
server: uvicorn
x-teapot: I am
content-type: text/html; charset=utf-8","[""Custom pages and templates"", ""Custom pages""]",[]
custom_templates:custom-pages-redirects,custom_templates,custom-pages-redirects,Custom redirects,"You can use the custom_redirect(location) function to redirect users to another page, for example in a file called pages/datasette.html :
{{ custom_redirect(""https://github.com/simonw/datasette"") }}
Now requests to http://localhost:8001/datasette will result in a redirect.
These redirects are served with a 302 Found status code by default. You can send a 301 Moved Permanently code by passing 301 as the second argument to the function:
{{ custom_redirect(""https://github.com/simonw/datasette"", 301) }}","[""Custom pages and templates"", ""Custom pages""]",[]
custom_templates:customization,custom_templates,customization,Custom pages and templates,Datasette provides a number of ways of customizing the way data is displayed.,[],[]
custom_templates:customization-static-files,custom_templates,customization-static-files,Serving static files,"Datasette can serve static files for you, using the --static option.
Consider the following directory structure:
metadata.json
static-files/styles.css
static-files/app.js
You can start Datasette using --static assets:static-files/ to serve those
files from the /assets/ mount point:
$ datasette -m metadata.json --static assets:static-files/ --memory
The following URLs will now serve the content from those CSS and JS files:
http://localhost:8001/assets/styles.css
http://localhost:8001/assets/app.js
You can reference those files from metadata.json like so:
{
""extra_css_urls"": [
""/assets/styles.css""
],
""extra_js_urls"": [
""/assets/app.js""
]
}","[""Custom pages and templates"", ""Custom CSS and JavaScript""]",[]
custom_templates:id1,custom_templates,id1,Custom pages,"You can add templated pages to your Datasette instance by creating HTML files in a pages directory within your templates directory.
For example, to add a custom page that is served at http://localhost/about you would create a file in templates/pages/about.html , then start Datasette like this:
$ datasette mydb.db --template-dir=templates/
You can nest directories within pages to create a nested structure. To create a http://localhost:8001/about/map page you would create templates/pages/about/map.html .","[""Custom pages and templates""]",[]
custom_templates:publishing-static-assets,custom_templates,publishing-static-assets,Publishing static assets,"The datasette publish command can be used to publish your static assets,
using the same syntax as above:
$ datasette publish cloudrun mydb.db --static assets:static-files/
This will upload the contents of the static-files/ directory as part of the
deployment, and configure Datasette to correctly serve the assets from /assets/ .","[""Custom pages and templates"", ""Custom CSS and JavaScript""]",[]
deploying:deploying,deploying,deploying,Deploying Datasette,"The quickest way to deploy a Datasette instance on the internet is to use the datasette publish command, described in Publishing data . This can be used to quickly deploy Datasette to a number of hosting providers including Heroku, Google Cloud Run and Vercel.
You can deploy Datasette to other hosting providers using the instructions on this page.",[],[]
deploying:deploying-fundamentals,deploying,deploying-fundamentals,Deployment fundamentals,"Datasette can be deployed as a single datasette process that listens on a port. Datasette is not designed to be run as root, so that process should listen on a higher port such as port 8000.
If you want to serve Datasette on port 80 (the HTTP default port) or port 443 (for HTTPS) you should run it behind a proxy server, such as nginx, Apache or HAProxy. The proxy server can listen on port 80/443 and forward traffic on to Datasette.","[""Deploying Datasette""]",[]
deploying:deploying-proxy,deploying,deploying-proxy,Running Datasette behind a proxy,"You may wish to run Datasette behind an Apache or nginx proxy, using a path within your existing site.
You can use the base_url configuration setting to tell Datasette to serve traffic with a specific URL prefix. For example, you could run Datasette like this:
datasette my-database.db --setting base_url /my-datasette/ -p 8009
This will run Datasette with the following URLs:
http://127.0.0.1:8009/my-datasette/ - the Datasette homepage
http://127.0.0.1:8009/my-datasette/my-database - the page for the my-database.db database
http://127.0.0.1:8009/my-datasette/my-database/some_table - the page for the some_table table
You can now set your nginx or Apache server to proxy the /my-datasette/ path to this Datasette instance.","[""Deploying Datasette""]",[]
deploying:deploying-systemd,deploying,deploying-systemd,Running Datasette using systemd,"You can run Datasette on Ubuntu or Debian systems using systemd .
First, ensure you have Python 3 and pip installed. On Ubuntu you can use sudo apt-get install python3 python3-pip .
You can install Datasette into a virtual environment, or you can install it system-wide. To install system-wide, use sudo pip3 install datasette .
Now create a folder for your Datasette databases, for example using mkdir /home/ubuntu/datasette-root .
You can copy a test database into that folder like so:
cd /home/ubuntu/datasette-root
curl -O https://latest.datasette.io/fixtures.db
Create a file at /etc/systemd/system/datasette.service with the following contents:
[Unit]
Description=Datasette
After=network.target
[Service]
Type=simple
User=ubuntu
Environment=DATASETTE_SECRET=
WorkingDirectory=/home/ubuntu/datasette-root
ExecStart=datasette serve . -h 127.0.0.1 -p 8000
Restart=on-failure
[Install]
WantedBy=multi-user.target
Add a random value for the DATASETTE_SECRET - this will be used to sign Datasette cookies such as the CSRF token cookie. You can generate a suitable value like so:
$ python3 -c 'import secrets; print(secrets.token_hex(32))'
This configuration will run Datasette against all database files contained in the /home/ubuntu/datasette-root directory. If that directory contains a metadata.yml (or .json ) file or a templates/ or plugins/ sub-directory those will automatically be loaded by Datasette - see Configuration directory mode for details.
You can start the Datasette process running using the following:
sudo systemctl daemon-reload
sudo systemctl start datasette.service
You will need to restart the Datasette service after making changes to its metadata.json configuration or adding a new database file to that directory. You can do that using:
sudo systemctl restart datasette.service
Once the service has started you can confirm that Datasette is running on port 8000 like so:
curl 127.0.0.1:8000/-/versions.json
# Should output JSON showing the installed version
Datasette will not be accessible from outside the server because it is listening on 127.0.0.1 . You can expose it by instead listening on 0.0.0.0 , but a better way is to set up a proxy such as nginx - see Running Datasette behind a proxy .","[""Deploying Datasette""]",[]
facets:facets-in-query-strings,facets,facets-in-query-strings,Facets in query strings,"To turn on faceting for specific columns on a Datasette table view, add one or more _facet=COLUMN parameters to the URL.
For example, if you want to turn on facets for the city_id and state columns, construct a URL that looks like this:
/dbname/tablename?_facet=state&_facet=city_id
This works for both the HTML interface and the .json view.
When enabled, facets will cause a facet_results block to be added to the JSON output, looking something like this:
{
""state"": {
""name"": ""state"",
""results"": [
{
""value"": ""CA"",
""label"": ""CA"",
""count"": 10,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&state=CA"",
""selected"": false
},
{
""value"": ""MI"",
""label"": ""MI"",
""count"": 4,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&state=MI"",
""selected"": false
},
{
""value"": ""MC"",
""label"": ""MC"",
""count"": 1,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&state=MC"",
""selected"": false
}
],
""truncated"": false
}
""city_id"": {
""name"": ""city_id"",
""results"": [
{
""value"": 1,
""label"": ""San Francisco"",
""count"": 6,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&city_id=1"",
""selected"": false
},
{
""value"": 2,
""label"": ""Los Angeles"",
""count"": 4,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&city_id=2"",
""selected"": false
},
{
""value"": 3,
""label"": ""Detroit"",
""count"": 4,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&city_id=3"",
""selected"": false
},
{
""value"": 4,
""label"": ""Memnonia"",
""count"": 1,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&city_id=4"",
""selected"": false
}
],
""truncated"": false
}
}
If Datasette detects that a column is a foreign key, the ""label"" property will be automatically derived from the detected label column on the referenced table.
The default number of facet results returned is 30, controlled by the default_facet_size setting.
You can increase this on an individual page by adding ?_facet_size=100 to the query string, up to a maximum of max_returned_rows (which defaults to 1000).","[""Facets""]",[]
facets:facets-metadata,facets,facets-metadata,Facets in metadata.json,"You can turn facets on by default for specific tables by adding them to a ""facets"" key in a Datasette Metadata file.
Here's an example that turns on faceting by default for the qLegalStatus column in the Street_Tree_List table in the sf-trees database:
{
""databases"": {
""sf-trees"": {
""tables"": {
""Street_Tree_List"": {
""facets"": [""qLegalStatus""]
}
}
}
}
}
Facets defined in this way will always be shown in the interface and returned in the API, regardless of the _facet arguments passed to the view.
You can specify array or date facets in metadata using JSON objects with a single key of array or date and a value specifying the column, like this:
{
""facets"": [
{""array"": ""tags""},
{""date"": ""created""}
]
}
You can change the default facet size (the number of results shown for each facet) for a table using facet_size :
{
""databases"": {
""sf-trees"": {
""tables"": {
""Street_Tree_List"": {
""facets"": [""qLegalStatus""],
""facet_size"": 10
}
}
}
}
}","[""Facets""]",[]
facets:suggested-facets,facets,suggested-facets,Suggested facets,"Datasette's table UI will suggest facets for the user to apply, based on the following criteria:
For the currently filtered data, are there any columns which, if applied as a facet...
Will return 30 or fewer unique options
Will return more than one unique option
Will return fewer unique options than the total number of filtered rows
And the query used to evaluate these criteria can be completed in under 50ms
That last point is particularly important: Datasette runs a query for every column that is displayed on a page, which could get expensive - so to avoid slow load times it sets a time limit of just 50ms for each of those queries.
This means suggested facets are unlikely to appear for tables with millions of records in them.","[""Facets""]",[]
full_text_search:full-text-search-fts-versions,full_text_search,full-text-search-fts-versions,FTS versions,"There are three different versions of the SQLite FTS module: FTS3, FTS4 and FTS5. You can tell which versions are supported by your instance of Datasette by checking the /-/versions page.
FTS5 is the most advanced module but may not be available in the SQLite version that is bundled with your Python installation. Most importantly, FTS5 is the only version that has the ability to order by search relevance without needing extra code.
If you can't be sure that FTS5 will be available, you should use FTS4.","[""Full-text search""]",[]
getting_started:getting-started,getting_started,getting-started,Getting started,,[],[]
index:contents,index,contents,Contents,"Getting started Play with a live demo Follow a tutorial Datasette in your browser with Datasette Lite Try Datasette without installing anything using Glitch Using Datasette on your own computer Installation Basic installation Datasette Desktop for Mac Using Homebrew Using pip Advanced installation options Using pipx Using Docker A note about extensions The Datasette Ecosystem sqlite-utils Dogsheep CLI reference datasette --help datasette serve datasette --get datasette serve --help-settings datasette plugins datasette install datasette uninstall datasette publish datasette publish cloudrun datasette publish heroku datasette package datasette inspect Pages and API endpoints Top-level index Database Table Row Publishing data datasette publish Publishing to Google Cloud Run Publishing to Heroku Publishing to Vercel Publishing to Fly Custom metadata and plugins datasette package Deploying Datasette Deployment fundamentals Running Datasette using systemd Running Datasette using OpenRC Deploying using buildpacks Running Datasette behind a proxy Nginx proxy configuration Apache proxy configuration JSON API Different shapes Pagination Special JSON arguments Table arguments Column filter arguments Special table arguments Expanding foreign key references Discovering the JSON for a page Running SQL queries Named parameters Views Canned queries Canned query parameters Additional canned query options Writable canned queries Magic parameters JSON API for writable canned queries Pagination Cross-database queries Authentication and permissions Actors Using the ""root"" actor Permissions Defining permissions with ""allow"" blocks The /-/allow-debug tool Configuring permissions in metadata.json Controlling access to an instance Controlling access to specific databases Controlling access to specific tables and views Controlling access to specific canned queries Controlling the ability to execute arbitrary SQL Checking permissions in plugins actor_matches_allow() The permissions debug tool The ds_actor cookie Including an expiry time The /-/logout page Built-in permissions view-instance view-database view-database-download view-table view-query execute-sql permissions-debug debug-menu Performance and caching Immutable mode Using ""datasette inspect"" HTTP caching datasette-hashed-urls CSV export URL parameters Streaming all records Binary data Linking to binary downloads Binary plugins Facets Facets in query strings Facets in metadata.json Suggested facets Speeding up facets with indexes Facet by JSON array Facet by date Full-text search The table page and table view API Advanced SQLite search queries Configuring full-text search for a table or view Searches using custom SQL Enabling full-text search for a SQLite table Configuring FTS using sqlite-utils Configuring FTS using csvs-to-sqlite Configuring FTS by hand FTS versions SpatiaLite Warning Installation Installing SpatiaLite on OS X Installing SpatiaLite on Linux Spatial indexing latitude/longitude columns Making use of a spatial index Importing shapefiles into SpatiaLite Importing GeoJSON polygons using Shapely Querying polygons using within() Metadata Per-database and per-table metadata Source, license and about Column descriptions Specifying units for a column Setting a default sort order Setting a custom page size Setting which columns can be used for sorting Specifying the label column for a table Hiding tables Using YAML for metadata Settings Using --setting Configuration directory mode Settings 
default_allow_sql default_page_size sql_time_limit_ms max_returned_rows num_sql_threads allow_facet default_facet_size facet_time_limit_ms facet_suggest_time_limit_ms suggest_facets allow_download default_cache_ttl cache_size_kb allow_csv_stream max_csv_mb truncate_cells_html force_https_urls template_debug trace_debug base_url Configuring the secret Using secrets with datasette publish Introspection /-/metadata /-/versions /-/plugins /-/settings /-/databases /-/threads /-/actor /-/messages Custom pages and templates Custom CSS and JavaScript CSS classes on the Serving static files Publishing static assets Custom templates Custom pages Path parameters for pages Custom headers and status codes Returning 404s Custom redirects Custom error pages Plugins Installing plugins One-off plugins using --plugins-dir Deploying plugins using datasette publish Seeing what plugins are installed Plugin configuration Secret configuration values Writing plugins Writing one-off plugins Starting an installable plugin using cookiecutter Packaging a plugin Static assets Custom templates Writing plugins that accept configuration Designing URLs for your plugin Building URLs within plugins Plugin hooks prepare_connection(conn, database, datasette) prepare_jinja2_environment(env, datasette) extra_template_vars(template, database, table, columns, view_name, request, datasette) extra_css_urls(template, database, table, columns, view_name, request, datasette) extra_js_urls(template, database, table, columns, view_name, request, datasette) extra_body_script(template, database, table, columns, view_name, request, datasette) publish_subcommand(publish) render_cell(row, value, column, table, database, datasette) register_output_renderer(datasette) register_routes(datasette) register_commands(cli) register_facet_classes() asgi_wrapper(datasette) startup(datasette) canned_queries(datasette, database, actor) actor_from_request(datasette, request) filters_from_request(request, database, table, datasette) permission_allowed(datasette, actor, action, resource) register_magic_parameters(datasette) forbidden(datasette, request, message) handle_exception(datasette, request, exception) menu_links(datasette, actor, request) table_actions(datasette, actor, database, table, request) database_actions(datasette, actor, database, request) skip_csrf(datasette, scope) get_metadata(datasette, key, database, table) Testing plugins Setting up a Datasette test instance Using pdb for errors thrown inside Datasette Using pytest fixtures Testing outbound HTTP calls with pytest-httpx Registering a plugin for the duration of a test Internals for plugins Request object The MultiParams class Response class Returning a response with .asgi_send(send) Setting cookies with response.set_cookie() Datasette class .databases .plugin_config(plugin_name, database=None, table=None) await .render_template(template, context=None, request=None) await .permission_allowed(actor, action, resource=None, default=False) await .ensure_permissions(actor, permissions) await .check_visibility(actor, action=None, resource=None, permissions=None) .get_database(name) .add_database(db, name=None, route=None) .add_memory_database(name) .remove_database(name) .sign(value, namespace=""default"") .unsign(value, namespace=""default"") .add_message(request, message, type=datasette.INFO) .absolute_url(request, path) .setting(key) datasette.client datasette.urls Database class Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None) db.hash await db.execute(sql, ...) 
Results await db.execute_fn(fn) await db.execute_write(sql, params=None, block=True) await db.execute_write_script(sql, block=True) await db.execute_write_many(sql, params_seq, block=True) await db.execute_write_fn(fn, block=True) db.close() Database introspection CSRF protection The _internal database The datasette.utils module parse_metadata(content) await_me_maybe(value) Tilde encoding datasette.tracer Tracing child tasks Import shortcuts Contributing General guidelines Setting up a development environment Running the tests Using fixtures Debugging Code formatting Running Black blacken-docs Prettier Editing and building the documentation Running Cog Continuously deployed demo instances Release process Alpha and beta releases Releasing bug fixes from a branch Upgrading CodeMirror Changelog 0.65.1 (2024-12-28) 0.65 (2024-10-07) 0.64.8 (2024-06-21) 0.64.7 (2024-06-12) 0.64.6 (2023-12-22) 0.64.5 (2023-10-08) 0.64.4 (2023-09-21) 0.64.3 (2023-04-27) 0.64.2 (2023-03-08) 0.64.1 (2023-01-11) 0.64 (2023-01-09) 0.63.3 (2022-12-17) 0.63.2 (2022-11-18) 0.63.1 (2022-11-10) 0.63 (2022-10-27) Features Plugin hooks and internals Documentation 0.62 (2022-08-14) Features Plugin hooks Bug fixes Documentation 0.61.1 (2022-03-23) 0.61 (2022-03-23) 0.60.2 (2022-02-07) 0.60.1 (2022-01-20) 0.60 (2022-01-13) Plugins and internals Faceting Other small fixes 0.59.4 (2021-11-29) 0.59.3 (2021-11-20) 0.59.2 (2021-11-13) 0.59.1 (2021-10-24) 0.59 (2021-10-14) 0.58.1 (2021-07-16) 0.58 (2021-07-14) 0.57.1 (2021-06-08) 0.57 (2021-06-05) New features Bug fixes and other improvements 0.56.1 (2021-06-05) 0.56 (2021-03-28) 0.55 (2021-02-18) 0.54.1 (2021-02-02) 0.54 (2021-01-25) The _internal database Named in-memory database support JavaScript modules Code formatting with Black and Prettier Other changes 0.53 (2020-12-10) 0.52.5 (2020-12-09) 0.52.4 (2020-12-05) 0.52.3 (2020-12-03) 0.52.2 (2020-12-02) 0.52.1 (2020-11-29) 0.52 (2020-11-28) 0.51.1 (2020-10-31) 0.51 (2020-10-31) New visual design Plugins can now add links within Datasette Binary data URL building Running Datasette behind a proxy Smaller changes 0.50.2 (2020-10-09) 0.50.1 (2020-10-09) 0.50 (2020-10-09) 0.49.1 (2020-09-15) 0.49 (2020-09-14) 0.48 (2020-08-16) 0.47.3 (2020-08-15) 0.47.2 (2020-08-12) 0.47.1 (2020-08-11) 0.47 (2020-08-11) 0.46 (2020-08-09) 0.45 (2020-07-01) Magic parameters for canned queries Log out Better plugin documentation New plugin hooks Smaller changes 0.44 (2020-06-11) Authentication Permissions Writable canned queries Flash messages Signed values and secrets CSRF protection Cookie methods register_routes() plugin hooks Smaller changes The road to Datasette 1.0 0.43 (2020-05-28) 0.42 (2020-05-08) 0.41 (2020-05-06) 0.40 (2020-04-21) 0.39 (2020-03-24) 0.38 (2020-03-08) 0.37.1 (2020-03-02) 0.37 (2020-02-25) 0.36 (2020-02-21) 0.35 (2020-02-04) 0.34 (2020-01-29) 0.33 (2019-12-22) 0.32 (2019-11-14) 0.31.2 (2019-11-13) 0.31.1 (2019-11-12) 0.31 (2019-11-11) 0.30.2 (2019-11-02) 0.30.1 (2019-10-30) 0.30 (2019-10-18) 0.29.3 (2019-09-02) 0.29.2 (2019-07-13) 0.29.1 (2019-07-11) 0.29 (2019-07-07) ASGI New plugin hook: asgi_wrapper New plugin hook: extra_template_vars Secret plugin configuration options Facet by date Easier custom templates for table rows ?_through= for joins through many-to-many tables Small changes 0.28 (2019-05-19) Supporting databases that change Faceting improvements, and faceting plugins datasette publish cloudrun register_output_renderer plugins Medium changes Small changes 0.27.1 (2019-05-09) 0.27 (2019-01-31) 0.26.1 (2019-01-10) 
0.26 (2019-01-02) 0.25.2 (2018-12-16) 0.25.1 (2018-11-04) 0.25 (2018-09-19) 0.24 (2018-07-23) 0.23.2 (2018-07-07) 0.23.1 (2018-06-21) 0.23 (2018-06-18) CSV export Foreign key expansions New configuration settings Control HTTP caching with ?_ttl= Improved support for SpatiaLite latest.datasette.io Miscellaneous 0.22.1 (2018-05-23) 0.22 (2018-05-20) 0.21 (2018-05-05) 0.20 (2018-04-20) 0.19 (2018-04-16) 0.18 (2018-04-14) 0.17 (2018-04-13) 0.16 (2018-04-13) 0.15 (2018-04-09) 0.14 (2017-12-09) 0.13 (2017-11-24) 0.12 (2017-11-16) 0.11 (2017-11-14) 0.10 (2017-11-14) 0.9 (2017-11-13) 0.8 (2017-11-13)","[""Datasette""]",[]
installation:id1,installation,id1,Installation,"If you just want to try Datasette out you don't need to install anything: see Try Datasette without installing anything using Glitch
There are two main options for installing Datasette. You can install it directly on to your machine, or you can install it using Docker.
If you want to start making contributions to the Datasette project by installing a copy that lets you directly modify the code, take a look at our guide to Setting up a development environment .
Basic installation
Datasette Desktop for Mac
Using Homebrew
Using pip
Advanced installation options
Using pipx
Installing plugins using pipx
Upgrading packages using pipx
Using Docker
Loading SpatiaLite
Installing plugins
A note about extensions",[],[]
installation:installation-advanced,installation,installation-advanced,Advanced installation options,,"[""Installation""]",[]
installation:installation-basic,installation,installation-basic,Basic installation,,"[""Installation""]",[]
installation:installation-extensions,installation,installation-extensions,A note about extensions,"SQLite supports extensions, such as SpatiaLite for geospatial operations.
These can be loaded using the --load-extension argument, like so:
datasette --load-extension=/usr/local/lib/mod_spatialite.dylib
Some Python installations do not include support for SQLite extensions. If this is the case you will see the following error when you attempt to load an extension:
Your Python installation does not have the ability to load SQLite extensions.
In some cases you may see the following error message instead:
AttributeError: 'sqlite3.Connection' object has no attribute 'enable_load_extension'
On macOS the easiest fix for this is to install Datasette using Homebrew:
brew install datasette
Use which datasette to confirm that datasette will run that version. The output should look something like this:
/usr/local/opt/datasette/bin/datasette
If you get a different location here such as /Library/Frameworks/Python.framework/Versions/3.10/bin/datasette you can run the following command to cause datasette to execute the Homebrew version instead:
alias datasette=$(echo $(brew --prefix datasette)/bin/datasette)
You can undo this operation using:
unalias datasette
If you need to run SQLite with extension support for other Python code, you can do so by installing Python itself using Homebrew:
brew install python
Then executing Python using:
/usr/local/opt/python@3/libexec/bin/python
A more convenient way to work with this version of Python may be to use it to create a virtual environment:
/usr/local/opt/python@3/libexec/bin/python -m venv datasette-venv
Then activate it like this:
source datasette-venv/bin/activate
Now running python and pip will work against a version of Python 3 that includes support for SQLite extensions:
pip install datasette
which datasette
datasette --version","[""Installation""]",[]
installation:installing-plugins-using-pipx,installation,installing-plugins-using-pipx,Installing plugins using pipx,"You can install additional datasette plugins with pipx inject like so:
$ pipx inject datasette datasette-json-html
injected package datasette-json-html into venv datasette
done! ✨ 🌟 ✨
$ datasette plugins
[
{
""name"": ""datasette-json-html"",
""static"": false,
""templates"": false,
""version"": ""0.6""
}
]","[""Installation"", ""Advanced installation options"", ""Using pipx""]",[]
installation:upgrading-packages-using-pipx,installation,upgrading-packages-using-pipx,Upgrading packages using pipx,"You can upgrade your pipx installation to the latest release of Datasette using pipx upgrade datasette :
$ pipx upgrade datasette
upgraded package datasette from 0.39 to 0.40 (location: /Users/simon/.local/pipx/venvs/datasette)
To upgrade a plugin within the pipx environment use pipx runpip datasette install -U name-of-plugin - like this:
$ datasette plugins
[
{
""name"": ""datasette-vega"",
""static"": true,
""templates"": false,
""version"": ""0.6""
}
]
$ pipx runpip datasette install -U datasette-vega
Collecting datasette-vega
Downloading datasette_vega-0.6.2-py3-none-any.whl (1.8 MB)
|████████████████████████████████| 1.8 MB 2.0 MB/s
...
Installing collected packages: datasette-vega
Attempting uninstall: datasette-vega
Found existing installation: datasette-vega 0.6
Uninstalling datasette-vega-0.6:
Successfully uninstalled datasette-vega-0.6
Successfully installed datasette-vega-0.6.2
$ datasette plugins
[
{
""name"": ""datasette-vega"",
""static"": true,
""templates"": false,
""version"": ""0.6.2""
}
]","[""Installation"", ""Advanced installation options"", ""Using pipx""]",[]
internals:database-close,internals,database-close,db.close(),"Closes all of the open connections to file-backed databases. This is mainly intended to be used by large test suites, to avoid hitting limits on the number of open files.","[""Internals for plugins"", ""Database class""]",[]
internals:database-constructor,internals,database-constructor,"Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None)","The Database() constructor can be used by plugins, in conjunction with .add_database(db, name=None, route=None) , to create and register new databases.
The arguments are as follows:
ds - Datasette class (required)
The Datasette instance you are attaching this database to.
path - string
Path to a SQLite database file on disk.
is_mutable - boolean
Set this to False to cause Datasette to open the file in immutable mode.
is_memory - boolean
Use this to create non-shared memory connections.
memory_name - string or None
Use this to create a named in-memory database. Unlike regular memory databases these can be accessed by multiple threads and will persist any changes made to them for the lifetime of the Datasette server process.
The first argument is the datasette instance you are attaching to, the second is a path= , then is_mutable and is_memory are both optional arguments.","[""Internals for plugins"", ""Database class""]",[]
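For example, a plugin could construct and register an immutable file-backed database plus a shared in-memory database like this (a sketch; the path and names are illustrative):
from datasette.database import Database

# Open an existing file in immutable mode (illustrative path)
archive = Database(
    datasette,
    path="path/to/archive.db",
    is_mutable=False,
)
datasette.add_database(archive, name="archive")

# A named in-memory database, shared across threads
scratch = datasette.add_database(Database(datasette, memory_name="scratch"))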
internals:database-execute,internals,database-execute,"await db.execute(sql, ...)","Executes a SQL query against the database and returns the resulting rows (see Results ).
sql - string (required)
The SQL query to execute. This can include ? or :named parameters.
params - list or dict
A list or dictionary of values to use for the parameters. List for ? , dictionary for :named .
truncate - boolean
Should the rows returned by the query be truncated at the maximum page size? Defaults to True , set this to False to disable truncation.
custom_time_limit - integer ms
A custom time limit for this query. This can be set to a lower value than the Datasette configured default. If a query takes longer than this it will be terminated early and raise a datasette.database.QueryInterrupted exception.
page_size - integer
Set a custom page size for truncation, overriding the configured Datasette default.
log_sql_errors - boolean
Should any SQL errors be logged to the console in addition to being raised as an error? Defaults to True .","[""Internals for plugins"", ""Database class""]",[]
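As a sketch, a plugin might run a parameterized query like this (the table and column names are hypothetical):
# Hypothetical table and columns, for illustration only
results = await db.execute(
    "select id, name from my_table where name like :search",
    {"search": "%cleo%"},
)
for row in results:
    print(row["id"], row["name"])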
internals:database-execute-fn,internals,database-execute-fn,await db.execute_fn(fn),"Executes a given callback function against a read-only database connection running in a thread. The function will be passed a SQLite connection, and the return value from the function will be returned by the await .
Example usage:
def get_version(conn):
return conn.execute(
""select sqlite_version()""
).fetchall()[0][0]
version = await db.execute_fn(get_version)","[""Internals for plugins"", ""Database class""]",[]
internals:database-execute-write,internals,database-execute-write,"await db.execute_write(sql, params=None, block=True)","SQLite only allows one database connection to write at a time. Datasette handles this for you by maintaining a queue of writes to be executed against a given database. Plugins can submit write operations to this queue and they will be executed in the order in which they are received.
This method can be used to queue up a non-SELECT SQL query to be executed against a single write connection to the database.
You can pass additional SQL parameters as a tuple or dictionary.
The method will block until the operation is completed, and the return value will be the return from calling conn.execute(...) using the underlying sqlite3 Python library.
If you pass block=False this behaviour changes to ""fire and forget"" - queries will be added to the write queue and executed in a separate thread while your code can continue to do other things. The method will return a UUID representing the queued task.","[""Internals for plugins"", ""Database class""]",[]
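A minimal sketch of both modes, using a hypothetical logs table:
# Blocking write (default): waits until the statement has executed
await db.execute_write(
    "insert into logs (message) values (?)",
    ("hello",),
)

# Fire-and-forget: returns a UUID for the queued task instead of blocking
task_id = await db.execute_write(
    "insert into logs (message) values (?)",
    ("deferred write",),
    block=False,
)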
internals:database-execute-write-fn,internals,database-execute-write-fn,"await db.execute_write_fn(fn, block=True)","This method works like .execute_write() , but instead of a SQL statement you give it a callable Python function. Your function will be queued up and then called when the write connection is available, passing that connection as the argument to the function.
The function can then perform multiple actions, safe in the knowledge that it has exclusive access to the single writable connection for as long as it is executing.
fn needs to be a regular function, not an async def function.
For example:
def delete_and_return_count(conn):
conn.execute(""delete from some_table where id > 5"")
return conn.execute(
""select count(*) from some_table""
).fetchone()[0]
try:
num_rows_left = await database.execute_write_fn(
delete_and_return_count
)
except Exception as e:
print(""An error occurred:"", e)
The value returned from await database.execute_write_fn(...) will be the return value from your function.
If your function raises an exception that exception will be propagated up to the await line.
If you specify block=False the method becomes fire-and-forget, queueing your function to be executed and then allowing your code after the call to .execute_write_fn() to continue running while the underlying thread waits for an opportunity to run your function. A UUID representing the queued task will be returned. Any exceptions in your code will be silently swallowed.","[""Internals for plugins"", ""Database class""]",[]
internals:database-hash,internals,database-hash,db.hash,"If the database was opened in immutable mode, this property returns the 64 character SHA-256 hash of the database contents as a string. Otherwise it returns None .","[""Internals for plugins"", ""Database class""]",[]
internals:datasette-absolute-url,internals,datasette-absolute-url,".absolute_url(request, path)","request - Request
The current Request object
path - string
A path, for example /dbname/table.json
Returns the absolute URL for the given path, including the protocol and host. For example:
absolute_url = datasette.absolute_url(
request, ""/dbname/table.json""
)
# Would return ""http://localhost:8001/dbname/table.json""
The current request object is used to determine the hostname and protocol that should be used for the returned URL. The force_https_urls configuration setting is taken into account.","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-add-database,internals,datasette-add-database,".add_database(db, name=None, route=None)","db - datasette.database.Database instance
The database to be attached.
name - string, optional
The name to be used for this database . If not specified Datasette will pick one based on the filename or memory name.
route - string, optional
This will be used in the URL path. If not specified, it will default to the same thing as the name .
The datasette.add_database(db) method lets you add a new database to the current Datasette instance.
The db parameter should be an instance of the datasette.database.Database class. For example:
from datasette.database import Database
datasette.add_database(
Database(
datasette,
path=""path/to/my-new-database.db"",
)
)
This will add a mutable database and serve it at /my-new-database .
Use is_mutable=False to add an immutable database.
.add_database() returns the Database instance, with its name set as the database.name attribute. Any time you are working with a newly added database you should use the return value of .add_database() , for example:
db = datasette.add_database(
Database(datasette, memory_name=""statistics"")
)
await db.execute_write(
""CREATE TABLE foo(id integer primary key)""
)","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-add-memory-database,internals,datasette-add-memory-database,.add_memory_database(name),"Adds a shared in-memory database with the specified name:
datasette.add_memory_database(""statistics"")
This is a shortcut for the following:
from datasette.database import Database
datasette.add_database(
Database(datasette, memory_name=""statistics"")
)
Using either of these patterns will result in the in-memory database being served at /statistics .
internals:datasette-add-message,internals,datasette-add-message,".add_message(request, message, type=datasette.INFO)","request - Request
The current Request object
message - string
The message string
type - constant, optional
The message type - datasette.INFO , datasette.WARNING or datasette.ERROR
Datasette's flash messaging mechanism allows you to add a message that will be displayed to the user on the next page that they visit. Messages are persisted in a ds_messages cookie. This method adds a message to that cookie.
You can try out these messages (including the different visual styling of the three message types) using the /-/messages debugging tool.","[""Internals for plugins"", ""Datasette class""]",[]
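A sketch of a view function (registered via register_routes()) that adds a message and then redirects; the path and message text are illustrative:
from datasette.utils.asgi import Response

async def save_settings(request, datasette):
    # ... perform some work here ...
    datasette.add_message(request, "Settings saved", datasette.INFO)
    return Response.redirect("/")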
internals:datasette-check-visibility,internals,datasette-check-visibility,"await .check_visibility(actor, action=None, resource=None, permissions=None)","actor - dictionary
The authenticated actor. This is usually request.actor .
action - string, optional
The name of the action that is being permission checked.
resource - string or tuple, optional
The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource.
permissions - list of action strings or (action, resource) tuples, optional
Provide this instead of action and resource to check multiple permissions at once.
This convenience method can be used to answer the question ""should this item be considered private, in that it is visible to me but it is not visible to anonymous users?""
It returns a tuple of two booleans, (visible, private) . visible indicates if the actor can see this resource. private will be True if an anonymous user would not be able to view the resource.
This example checks if the user can access a specific table, and sets private so that a padlock icon can later be displayed:
visible, private = await self.ds.check_visibility(
request.actor,
action=""view-table"",
resource=(database, table),
)
The following example runs three checks in a row, similar to await .ensure_permissions(actor, permissions) . If any of the checks are denied before one of them is explicitly granted then visible will be False . private will be True if an anonymous user would not be able to view the resource.
visible, private = await self.ds.check_visibility(
request.actor,
permissions=[
(""view-table"", (database, table)),
(""view-database"", database),
""view-instance"",
],
)","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-databases,internals,datasette-databases,.databases,"Property exposing a collections.OrderedDict of databases currently connected to Datasette.
The dictionary keys are the name of the database that is used in the URL - e.g. /fixtures would have a key of ""fixtures"" . The values are Database class instances.
All databases are listed, irrespective of user permissions. This means that the _internal database will always be listed here.","[""Internals for plugins"", ""Datasette class""]",[]
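A sketch of iterating over the connected databases, for example to log their names and documented properties:
for name, db in datasette.databases.items():
    # db is a Database instance; is_memory and size are documented properties
    print(name, db.is_memory, db.size)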
internals:datasette-ensure-permissions,internals,datasette-ensure-permissions,"await .ensure_permissions(actor, permissions)","actor - dictionary
The authenticated actor. This is usually request.actor .
permissions - list
A list of permissions to check. Each permission in that list can be a string action name or a 2-tuple of (action, resource) .
This method allows multiple permissions to be checked at once. It raises a datasette.Forbidden exception if any of the checks are denied before one of them is explicitly granted.
This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if one of the following checks is explicitly granted before any of them is denied:
await self.ds.ensure_permissions(
request.actor,
[
(""view-table"", (database, table)),
(""view-database"", database),
""view-instance"",
],
)","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-get-database,internals,datasette-get-database,.get_database(name),"name - string, optional
The name of the database - optional.
Returns the specified database object. Raises a KeyError if the database does not exist. Call this method without an argument to return the first connected database.","[""Internals for plugins"", ""Datasette class""]",[]
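For example (the database name here is illustrative):
# Look up a specific database, falling back to the first connected one
try:
    db = datasette.get_database("fixtures")
except KeyError:
    db = datasette.get_database()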
internals:datasette-permission-allowed,internals,datasette-permission-allowed,"await .permission_allowed(actor, action, resource=None, default=False)","actor - dictionary
The authenticated actor. This is usually request.actor .
action - string
The name of the action that is being permission checked.
resource - string or tuple, optional
The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource.
default - optional, True or False
Should this permission check be default allow or default deny.
Check if the given actor has permission to perform the given action on the given resource.
Some permission checks are carried out against rules defined in metadata.json , while other custom permissions may be decided by plugins that implement the permission_allowed(datasette, actor, action, resource) plugin hook.
If neither metadata.json nor any of the plugins provide an answer to the permission query the default argument will be returned.
See Built-in permissions for a full list of permission actions included in Datasette core.","[""Internals for plugins"", ""Datasette class""]",[]
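A sketch of a permission check inside a plugin; the action and resource follow the view-table example used elsewhere in these docs:
allowed = await datasette.permission_allowed(
    request.actor,
    "view-table",
    resource=("fixtures", "facetable"),
    default=True,
)
if not allowed:
    # The plugin could raise datasette.Forbidden or hide the item here
    ...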
internals:datasette-plugin-config,internals,datasette-plugin-config,".plugin_config(plugin_name, database=None, table=None)","plugin_name - string
The name of the plugin to look up configuration for. Usually this is something similar to datasette-cluster-map .
database - None or string
The database the user is interacting with.
table - None or string
The table the user is interacting with.
This method lets you read plugin configuration values that were set in metadata.json . See Writing plugins that accept configuration for full details of how this method should be used.
The return value will be the value from the configuration file - usually a dictionary.
If the plugin is not configured the return value will be None .","[""Internals for plugins"", ""Datasette class""]",[]
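A sketch of reading table-level configuration, assuming database and table variables are in scope; the plugin name and configuration keys are illustrative:
config = (
    datasette.plugin_config(
        "datasette-cluster-map",
        database=database,
        table=table,
    )
    or {}
)
latitude_column = config.get("latitude_column", "latitude")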
internals:datasette-remove-database,internals,datasette-remove-database,.remove_database(name),"name - string
The name of the database to be removed.
This removes a database that has been previously added. name= is the unique name of that database.","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-setting,internals,datasette-setting,.setting(key),"key - string
The name of the setting, e.g. base_url .
Returns the configured value for the specified setting . This can be a string, boolean or integer depending on the requested setting.
For example:
downloads_are_allowed = datasette.setting(""allow_download"")","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-unsign,internals,datasette-unsign,".unsign(value, namespace=""default"")","signed - string
The signed string that was created using .sign(value, namespace=""default"") .
namespace - string, optional
The alternative namespace, if one was used.
Returns the original, decoded object that was passed to .sign(value, namespace=""default"") . If the signature is not valid this raises a itsdangerous.BadSignature exception.","[""Internals for plugins"", ""Datasette class""]",[]
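A sketch of a sign/unsign round trip; the "tokens" namespace is illustrative:
from itsdangerous import BadSignature

token = datasette.sign({"id": "cleopaws"}, namespace="tokens")
try:
    decoded = datasette.unsign(token, namespace="tokens")
except BadSignature:
    decoded = None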
internals:internals,internals,internals,Internals for plugins,Many Plugin hooks are passed objects that provide access to internal Datasette functionality. The interface to these objects should not be considered stable with the exception of methods that are documented here.,[],[]
internals:internals-database,internals,internals-database,Database class,"Instances of the Database class can be used to execute queries against attached SQLite databases, and to run introspection against their schemas.","[""Internals for plugins""]",[]
internals:internals-database-introspection,internals,internals-database-introspection,Database introspection,"The Database class also provides properties and methods for introspecting the database.
db.name - string
The name of the database - usually the filename without the .db extension.
db.size - integer
The size of the database file in bytes. 0 for :memory: databases.
db.mtime_ns - integer or None
The last modification time of the database file in nanoseconds since the epoch. None for :memory: databases.
db.is_mutable - boolean
Is this database mutable, and allowed to accept writes?
db.is_memory - boolean
Is this database an in-memory database?
await db.attached_databases() - list of named tuples
Returns a list of additional databases that have been connected to this database using the SQLite ATTACH command. Each named tuple has fields seq , name and file .
await db.table_exists(table) - boolean
Check if a table called table exists.
await db.table_names() - list of strings
List of names of tables in the database.
await db.view_names() - list of strings
List of names of views in the database.
await db.table_columns(table) - list of strings
Names of columns in a specific table.
await db.table_column_details(table) - list of named tuples
Full details of the columns in a specific table. Each column is represented by a Column named tuple with fields cid (integer representing the column position), name (string), type (string, e.g. REAL or VARCHAR(30) ), notnull (integer 1 or 0), default_value (string or None), is_pk (integer 1 or 0).
await db.primary_keys(table) - list of strings
Names of the columns that are part of the primary key for this table.
await db.fts_table(table) - string or None
The name of the FTS table associated with this table, if one exists.
await db.label_column_for_table(table) - string or None
The label column that is associated with this table - either automatically detected or using the ""label_column"" key from Metadata , see Specifying the label column for a table .
await db.foreign_keys_for_table(table) - list of dictionaries
Details of columns in this table which are foreign keys to other tables. A list of dictionaries where each dictionary is shaped like this: {""column"": string, ""other_table"": string, ""other_column"": string} .
await db.hidden_table_names() - list of strings
List of tables which Datasette ""hides"" by default - usually these are tables associated with SQLite's full-text search feature, the SpatiaLite extension or tables hidden using the Hiding tables feature.
await db.get_table_definition(table) - string
Returns the SQL definition for the table - the CREATE TABLE statement and any associated CREATE INDEX statements.
await db.get_view_definition(view) - string
Returns the SQL definition of the named view.
await db.get_all_foreign_keys() - dictionary
Dictionary representing both incoming and outgoing foreign keys for this table. It has two keys, ""incoming"" and ""outgoing"" , each of which is a list of dictionaries with keys ""column"" , ""other_table"" and ""other_column"" . For example:
{
""incoming"": [],
""outgoing"": [
{
""other_table"": ""attraction_characteristic"",
""column"": ""characteristic_id"",
""other_column"": ""pk"",
},
{
""other_table"": ""roadside_attractions"",
""column"": ""attraction_id"",
""other_column"": ""pk"",
}
]
}","[""Internals for plugins"", ""Database class""]",[]
internals:internals-datasette,internals,internals-datasette,Datasette class,"This object is an instance of the Datasette class, passed to many plugin hooks as an argument called datasette .
You can create your own instance of this - for example to help write tests for a plugin - like so:
from datasette.app import Datasette
# With no arguments a single in-memory database will be attached
datasette = Datasette()
# The files= argument can load files from disk
datasette = Datasette(files=[""/path/to/my-database.db""])
# Pass metadata as a JSON dictionary like this
datasette = Datasette(
files=[""/path/to/my-database.db""],
metadata={
""databases"": {
""my-database"": {
""description"": ""This is my database""
}
}
},
)
Constructor parameters include:
files=[...] - a list of database files to open
immutables=[...] - a list of database files to open in immutable mode
metadata={...} - a dictionary of Metadata
config_dir=... - the configuration directory to use, stored in datasette.config_dir","[""Internals for plugins""]",[]
internals:internals-datasette-urls,internals,internals-datasette-urls,datasette.urls,"The datasette.urls object contains methods for building URLs to pages within Datasette. Plugins should use this to link to pages, since these methods take into account any base_url configuration setting that might be in effect.
datasette.urls.instance(format=None)
Returns the URL to the Datasette instance root page. This is usually ""/"" .
datasette.urls.path(path, format=None)
Takes a path and returns the full path, taking base_url into account.
For example, datasette.urls.path(""-/logout"") will return the path to the logout page, which will be ""/-/logout"" by default or /prefix-path/-/logout if base_url is set to /prefix-path/
datasette.urls.logout()
Returns the URL to the logout page, usually ""/-/logout""
datasette.urls.static(path)
Returns the URL of one of Datasette's default static assets, for example ""/-/static/app.css""
datasette.urls.static_plugins(plugin_name, path)
Returns the URL of one of the static assets belonging to a plugin.
datasette.urls.static_plugins(""datasette_cluster_map"", ""datasette-cluster-map.js"") would return ""/-/static-plugins/datasette_cluster_map/datasette-cluster-map.js""
datasette.urls.database(database_name, format=None)
Returns the URL to a database page, for example ""/fixtures""
datasette.urls.table(database_name, table_name, format=None)
Returns the URL to a table page, for example ""/fixtures/facetable""
datasette.urls.query(database_name, query_name, format=None)
Returns the URL to a query page, for example ""/fixtures/pragma_cache_size""
These functions can be accessed via the {{ urls }} object in Datasette templates, for example:
<a href=""{{ urls.instance() }}"">Homepage</a>
<a href=""{{ urls.database(""fixtures"") }}"">Fixtures database</a>
<a href=""{{ urls.table(""fixtures"", ""facetable"") }}"">facetable table</a>
<a href=""{{ urls.query(""fixtures"", ""pragma_cache_size"") }}"">pragma_cache_size query</a>
Use the format=""json"" (or ""csv"" or other formats supported by plugins) argument to get back URLs to the JSON representation. This is the path with .json added on the end.
These methods each return a datasette.utils.PrefixedUrlString object, which is a subclass of the Python str type. This allows the logic that considers the base_url setting to detect if that prefix has already been applied to the path.","[""Internals for plugins"", ""Datasette class""]",[]
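In plugin code the same methods can be called directly, for example:
table_url = datasette.urls.table("fixtures", "facetable")
json_url = datasette.urls.table("fixtures", "facetable", format="json")
# json_url is table_url with .json appended, taking base_url into account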
internals:internals-multiparams,internals,internals-multiparams,The MultiParams class,"request.args is a MultiParams object - a dictionary-like object which provides access to query string parameters that may have multiple values.
Consider the query string ?foo=1&foo=2&bar=3 - with two values for foo and one value for bar .
request.args[key] - string
Returns the first value for that key, or raises a KeyError if the key is missing. For the above example request.args[""foo""] would return ""1"" .
request.args.get(key) - string or None
Returns the first value for that key, or None if the key is missing. Pass a second argument to specify a different default, e.g. q = request.args.get(""q"", """") .
request.args.getlist(key) - list of strings
Returns the list of strings for that key. request.args.getlist(""foo"") would return [""1"", ""2""] in the above example. request.args.getlist(""bar"") would return [""3""] . If the key is missing an empty list will be returned.
request.args.keys() - list of strings
Returns the list of available keys - for the example this would be [""foo"", ""bar""] .
key in request.args - True or False
You can use if key in request.args to check if a key is present.
for key in request.args - iterator
This lets you loop through every available key.
len(request.args) - integer
Returns the number of keys.","[""Internals for plugins""]",[]
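A sketch of using these methods inside a view function registered with register_routes(), matching the ?foo=1&foo=2&bar=3 example above:
from datasette.utils.asgi import Response

async def my_view(request):
    foos = request.args.getlist("foo")  # ["1", "2"]
    bar = request.args.get("bar", "")  # "3"
    has_baz = "baz" in request.args  # False in this example
    return Response.text(f"foo={foos} bar={bar} baz={has_baz}")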
internals:internals-response,internals,internals-response,Response class,"The Response class can be returned from view functions that have been registered using the register_routes(datasette) hook.
The Response() constructor takes the following arguments:
body - string
The body of the response.
status - integer (optional)
The HTTP status - defaults to 200.
headers - dictionary (optional)
A dictionary of extra HTTP headers, e.g. {""x-hello"": ""world""} .
content_type - string (optional)
The content-type for the response. Defaults to text/plain .
For example:
from datasette.utils.asgi import Response
response = Response(
""This is XML"",
content_type=""application/xml; charset=utf-8"",
)
The quickest way to create responses is using the Response.text(...) , Response.html(...) , Response.json(...) or Response.redirect(...) helper methods:
from datasette.utils.asgi import Response
html_response = Response.html(""This is HTML"")
json_response = Response.json({""this_is"": ""json""})
text_response = Response.text(
""This will become utf-8 encoded text""
)
# Redirects are served as 302, unless you pass status=301:
redirect_response = Response.redirect(
""https://latest.datasette.io/""
)
Each of these responses will use the correct corresponding content-type - text/html; charset=utf-8 , application/json; charset=utf-8 or text/plain; charset=utf-8 respectively.
Each of the helper methods take optional status= and headers= arguments, documented above.","[""Internals for plugins""]",[]
internals:internals-response-asgi-send,internals,internals-response-asgi-send,Returning a response with .asgi_send(send),"In most cases you will return Response objects from your own view functions. You can also use a Response instance to respond at a lower level via ASGI, for example if you are writing code that uses the asgi_wrapper(datasette) hook.
Create a Response object and then use await response.asgi_send(send) , passing the ASGI send function. For example:
async def require_authorization(scope, receive, send):
response = Response.text(
""401 Authorization Required"",
headers={
""www-authenticate"": 'Basic realm=""Datasette"", charset=""UTF-8""'
},
status=401,
)
await response.asgi_send(send)","[""Internals for plugins"", ""Response class""]",[]
internals:internals-response-set-cookie,internals,internals-response-set-cookie,Setting cookies with response.set_cookie(),"To set cookies on the response, use the response.set_cookie(...) method. The method signature looks like this:
def set_cookie(
self,
key,
value="""",
max_age=None,
expires=None,
path=""/"",
domain=None,
secure=False,
httponly=False,
samesite=""lax"",
): ...
You can use this with datasette.sign() to set signed cookies. Here's how you would set the ds_actor cookie for use with Datasette authentication :
response = Response.redirect(""/"")
response.set_cookie(
""ds_actor"",
datasette.sign({""a"": {""id"": ""cleopaws""}}, ""actor""),
)
return response","[""Internals for plugins"", ""Response class""]",[]