id,page,ref,title,content,breadcrumbs,references
plugin_hooks:plugin-hook-filters-from-request,plugin_hooks,plugin-hook-filters-from-request,"filters_from_request(request, database, table, datasette)","request - Request object
The current HTTP request.
database - string
The name of the database.
table - string
The name of the table.
datasette - Datasette class
You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries.
This hook runs on the table page, and can influence the where clause of the SQL query used to populate that page, based on query string arguments on the incoming request.
The hook should return an instance of datasette.filters.FilterArguments which has one required and three optional arguments:
return FilterArguments(
where_clauses=[""id > :max_id""],
params={""max_id"": 5},
human_descriptions=[""max_id is greater than 5""],
extra_context={},
)
The arguments to the FilterArguments class constructor are as follows:
where_clauses - list of strings, required
A list of SQL fragments that will be inserted into the SQL query, joined by the and operator. These can include :named parameters which will be populated using data in params .
params - dictionary, optional
Additional keyword arguments to be used when the query is executed. These should match any :arguments in the where clauses.
human_descriptions - list of strings, optional
These strings will be included in the human-readable description at the top of the page and in the page title.
extra_context - dictionary, optional
Additional context variables that should be made available to the table.html template when it is rendered.
This example plugin causes 0 results to be returned if ?_nothing=1 is added to the URL:
from datasette import hookimpl
from datasette.filters import FilterArguments
@hookimpl
def filters_from_request(request):
if request.args.get(""_nothing""):
return FilterArguments(
[""1 = 0""], human_descriptions=[""NOTHING""]
)
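Here is a sketch of a variant that also uses params - it filters on a hypothetical ?max_id= query string argument, wiring a :named parameter through the params dictionary:
from datasette import hookimpl
from datasette.filters import FilterArguments
@hookimpl
def filters_from_request(request):
    # Hypothetical ?max_id=10 support
    max_id = request.args.get(""max_id"")
    if max_id:
        return FilterArguments(
            where_clauses=[""id < :max_id""],
            params={""max_id"": max_id},
            human_descriptions=[
                ""id is less than {}"".format(max_id)
            ],
        )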
Example: datasette-leaflet-freedraw","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-leaflet-freedraw"", ""label"": ""datasette-leaflet-freedraw""}]"
plugin_hooks:plugin-hook-extra-template-vars,plugin_hooks,plugin-hook-extra-template-vars,"extra_template_vars(template, database, table, columns, view_name, request, datasette)","Extra template variables that should be made available in the rendered template context.
template - string
The template that is being rendered, e.g. database.html
database - string or None
The name of the database, or None if the page does not correspond to a database (e.g. the root page)
table - string or None
The name of the table, or None if the page does not correspond to a table
columns - list of strings or None
The names of the database columns that will be displayed on this page. None if the page does not contain a table.
view_name - string
The name of the view being displayed. ( index , database , table , and row are the most important ones.)
request - Request object or None
The current HTTP request. This can be None if the request object is not available.
datasette - Datasette class
You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name)
This hook can return one of three different types:
Dictionary
If you return a dictionary its keys and values will be merged into the template context.
Function that returns a dictionary
If you return a function it will be executed. If it returns a dictionary those values will be merged into the template context.
Function that returns an awaitable function that returns a dictionary
You can also return a function which returns an awaitable function which returns a dictionary.
Datasette runs Jinja2 in async mode , which means you can add awaitable functions to the template scope and they will be automatically awaited when they are rendered by the template.
Here's an example plugin that adds a ""user_agent"" variable to the template context containing the current request's User-Agent header:
@hookimpl
def extra_template_vars(request):
return {""user_agent"": request.headers.get(""user-agent"")}
This example returns an awaitable function which adds a list of hidden_table_names to the context:
@hookimpl
def extra_template_vars(datasette, database):
async def hidden_table_names():
if database:
db = datasette.databases[database]
return {
""hidden_table_names"": await db.hidden_table_names()
}
else:
return {}
return hidden_table_names
And here's an example which adds a sql_first(sql_query) function which executes a SQL statement and returns the first column of the first row of results:
@hookimpl
def extra_template_vars(datasette, database):
async def sql_first(sql, dbname=None):
dbname = (
dbname
or database
or next(iter(datasette.databases.keys()))
)
result = await datasette.execute(dbname, sql)
return result.rows[0][0]
return {""sql_first"": sql_first}
You can then use the new function in a template like so:
SQLite version: {{ sql_first(""select sqlite_version()"") }}
Examples: datasette-search-all , datasette-template-sql","[""Plugin hooks""]","[{""href"": ""https://jinja.palletsprojects.com/en/2.10.x/api/#async-support"", ""label"": ""async mode""}, {""href"": ""https://datasette.io/plugins/datasette-search-all"", ""label"": ""datasette-search-all""}, {""href"": ""https://datasette.io/plugins/datasette-template-sql"", ""label"": ""datasette-template-sql""}]"
plugin_hooks:plugin-hook-extra-js-urls,plugin_hooks,plugin-hook-extra-js-urls,"extra_js_urls(template, database, table, columns, view_name, request, datasette)","This takes the same arguments as extra_template_vars(...)
This works in the same way as extra_css_urls() but for JavaScript. You can
return a list of URLs, a list of dictionaries or an awaitable function that returns those things:
from datasette import hookimpl
@hookimpl
def extra_js_urls():
return [
{
""url"": ""https://code.jquery.com/jquery-3.3.1.slim.min.js"",
""sri"": ""sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo"",
}
]
You can also return URLs to files from your plugin's static/ directory, if
you have one:
@hookimpl
def extra_js_urls():
return [""/-/static-plugins/your-plugin/app.js""]
Note that your-plugin here should be the hyphenated plugin name - the name that is displayed in the list on the /-/plugins debug page.
If your code uses JavaScript modules you should include the ""module"": True key. See Custom CSS and JavaScript for more details.
@hookimpl
def extra_js_urls():
return [
{
""url"": ""/-/static-plugins/your-plugin/app.js"",
""module"": True,
}
]
Examples: datasette-cluster-map , datasette-vega","[""Plugin hooks""]","[{""href"": ""https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules"", ""label"": ""JavaScript modules""}, {""href"": ""https://datasette.io/plugins/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}, {""href"": ""https://datasette.io/plugins/datasette-vega"", ""label"": ""datasette-vega""}]"
plugin_hooks:plugin-hook-extra-css-urls,plugin_hooks,plugin-hook-extra-css-urls,"extra_css_urls(template, database, table, columns, view_name, request, datasette)","This takes the same arguments as extra_template_vars(...)
Return a list of extra CSS URLs that should be included on the page. These can
take advantage of the CSS class hooks described in Custom pages and templates .
This can be a list of URLs:
from datasette import hookimpl
@hookimpl
def extra_css_urls():
return [
""https://stackpath.bootstrapcdn.com/bootstrap/4.1.0/css/bootstrap.min.css""
]
Or a list of dictionaries defining both a URL and an
SRI hash :
@hookimpl
def extra_css_urls():
return [
{
""url"": ""https://stackpath.bootstrapcdn.com/bootstrap/4.1.0/css/bootstrap.min.css"",
""sri"": ""sha384-9gVQ4dYFwwWSjIDZnLEWnxCjeSWFphJiwGPXr1jddIhOegiu1FwO5qRGvFXOdJZ4"",
}
]
This function can also return an awaitable function, useful if it needs to run any async code:
@hookimpl
def extra_css_urls(datasette):
async def inner():
db = datasette.get_database()
results = await db.execute(
""select url from css_files""
)
return [r[0] for r in results]
return inner
Examples: datasette-cluster-map , datasette-vega","[""Plugin hooks""]","[{""href"": ""https://www.srihash.org/"", ""label"": ""SRI hash""}, {""href"": ""https://datasette.io/plugins/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}, {""href"": ""https://datasette.io/plugins/datasette-vega"", ""label"": ""datasette-vega""}]"
plugin_hooks:plugin-hook-extra-body-script,plugin_hooks,plugin-hook-extra-body-script,"extra_body_script(template, database, table, columns, view_name, request, datasette)","Extra JavaScript to be added to a <script> block at the end of the <body> element on the page:
@hookimpl
def extra_body_script():
return {
""module"": True,
""script"": ""console.log('Your JavaScript goes here...')"",
}
This will add the following to the end of your page:
<script type=""module"">console.log('Your JavaScript goes here...')</script>
Example: datasette-cluster-map","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}]"
plugin_hooks:plugin-hook-database-actions,plugin_hooks,plugin-hook-database-actions,"database_actions(datasette, actor, database, request)","datasette - Datasette class
You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries.
actor - dictionary or None
The currently authenticated actor .
database - string
The name of the database.
request - Request object
The current HTTP request.
This hook is similar to table_actions(datasette, actor, database, table, request) but populates an actions menu on the database page.
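A minimal sketch of what this hook can return - each action is a dictionary with ""href"" and ""label"" keys (the /-/upload path here is a hypothetical route registered elsewhere by the plugin):
from datasette import hookimpl
@hookimpl
def database_actions(datasette, database):
    return [
        {
            # Link the menu item to a hypothetical upload page
            ""href"": datasette.urls.database(database) + ""/-/upload"",
            ""label"": ""Upload a CSV to this database"",
        }
    ]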
Example: datasette-graphql","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-graphql"", ""label"": ""datasette-graphql""}]"
plugin_hooks:plugin-hook-canned-queries,plugin_hooks,plugin-hook-canned-queries,"canned_queries(datasette, database, actor)","datasette - Datasette class
You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries.
database - string
The name of the database.
actor - dictionary or None
The currently authenticated actor .
Use this hook to return a dictionary of additional canned query definitions for the specified database. The return value should be the same shape as the JSON described in the canned query documentation.
from datasette import hookimpl
@hookimpl
def canned_queries(datasette, database):
if database == ""mydb"":
return {
""my_query"": {
""sql"": ""select * from my_table where id > :min_id""
}
}
The hook can alternatively return an awaitable function that returns a dictionary. Here's an example that returns queries that have been stored in the saved_queries database table, if one exists:
from datasette import hookimpl
@hookimpl
def canned_queries(datasette, database):
async def inner():
db = datasette.get_database(database)
if await db.table_exists(""saved_queries""):
results = await db.execute(
""select name, sql from saved_queries""
)
return {
result[""name""]: {""sql"": result[""sql""]}
for result in results
}
return inner
The actor parameter can be used to include the currently authenticated actor in your decision. Here's an example that returns saved queries that were saved by that actor:
from datasette import hookimpl
@hookimpl
def canned_queries(datasette, database, actor):
async def inner():
db = datasette.get_database(database)
if actor is not None and await db.table_exists(
""saved_queries""
):
results = await db.execute(
""select name, sql from saved_queries where actor_id = :id"",
{""id"": actor[""id""]},
)
return {
result[""name""]: {""sql"": result[""sql""]}
for result in results
}
return inner
Example: datasette-saved-queries","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-saved-queries"", ""label"": ""datasette-saved-queries""}]"
plugin_hooks:plugin-hook-actor-from-request,plugin_hooks,plugin-hook-actor-from-request,"actor_from_request(datasette, request)","datasette - Datasette class
You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries.
request - Request object
The current HTTP request.
This is part of Datasette's authentication and permissions system . The function should attempt to authenticate an actor (either a user or an API actor of some sort) based on information in the request.
If it cannot authenticate an actor, it should return None . Otherwise it should return a dictionary representing that actor.
Here's an example that authenticates the actor based on an incoming API key:
from datasette import hookimpl
import secrets
SECRET_KEY = ""this-is-a-secret""
@hookimpl
def actor_from_request(datasette, request):
authorization = (
request.headers.get(""authorization"") or """"
)
expected = ""Bearer {}"".format(SECRET_KEY)
if secrets.compare_digest(authorization, expected):
return {""id"": ""bot""}
If you install this in your plugins directory you can test it like this:
$ curl -H 'Authorization: Bearer this-is-a-secret' http://localhost:8003/-/actor.json
Instead of returning a dictionary, this function can return an awaitable function which itself returns either None or a dictionary. This is useful for authentication functions that need to make a database query - for example:
from datasette import hookimpl
@hookimpl
def actor_from_request(datasette, request):
async def inner():
token = request.args.get(""_token"")
if not token:
return None
# Look up ?_token=xxx in sessions table
result = await datasette.get_database().execute(
""select count(*) from sessions where token = ?"",
[token],
)
if result.first()[0]:
return {""token"": token}
else:
return None
return inner
Example: datasette-auth-tokens","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-auth-tokens"", ""label"": ""datasette-auth-tokens""}]"
plugin_hooks:plugin-asgi-wrapper,plugin_hooks,plugin-asgi-wrapper,asgi_wrapper(datasette),"Return an ASGI middleware wrapper function that will be applied to the Datasette ASGI application.
This is a very powerful hook. You can use it to manipulate the entire Datasette response, or even to configure new URL routes that will be handled by your own custom code.
You can write your ASGI code directly against the low-level specification, or you can use the middleware utilities provided by an ASGI framework such as Starlette .
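For example, here is a minimal sketch that applies Starlette's off-the-shelf gzip middleware to every Datasette response (assuming the starlette package is installed):
from datasette import hookimpl
from starlette.middleware.gzip import GZipMiddleware
@hookimpl
def asgi_wrapper(datasette):
    def wrap_with_gzip(app):
        # Compress any response body larger than 1000 bytes
        return GZipMiddleware(app, minimum_size=1000)
    return wrap_with_gzip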
This example plugin adds a x-databases HTTP header listing the currently attached databases:
from datasette import hookimpl
from functools import wraps
@hookimpl
def asgi_wrapper(datasette):
def wrap_with_databases_header(app):
@wraps(app)
async def add_x_databases_header(
scope, receive, send
):
async def wrapped_send(event):
if event[""type""] == ""http.response.start"":
original_headers = (
event.get(""headers"") or []
)
event = {
""type"": event[""type""],
""status"": event[""status""],
""headers"": original_headers
+ [
[
b""x-databases"",
"", "".join(
datasette.databases.keys()
).encode(""utf-8""),
]
],
}
await send(event)
await app(scope, receive, wrapped_send)
return add_x_databases_header
return wrap_with_databases_header
Examples: datasette-cors , datasette-pyinstrument , datasette-total-page-time","[""Plugin hooks""]","[{""href"": ""https://asgi.readthedocs.io/"", ""label"": ""ASGI""}, {""href"": ""https://www.starlette.io/middleware/"", ""label"": ""Starlette""}, {""href"": ""https://datasette.io/plugins/datasette-cors"", ""label"": ""datasette-cors""}, {""href"": ""https://datasette.io/plugins/datasette-pyinstrument"", ""label"": ""datasette-pyinstrument""}, {""href"": ""https://datasette.io/plugins/datasette-total-page-time"", ""label"": ""datasette-total-page-time""}]"
authentication:permissionsdebugview,authentication,permissionsdebugview,The permissions debug tool,"The debug tool at /-/permissions is only available to the authenticated root user (or any actor granted the permissions-debug action according to a plugin).
It shows the thirty most recent permission checks that have been carried out by the Datasette instance.
This is designed to help administrators and plugin authors understand exactly how permission checks are being carried out, in order to effectively configure Datasette's permission system.","[""Authentication and permissions""]",[]
authentication:permissions-view-table,authentication,permissions-view-table,view-table,"Actor is allowed to view a table (or view) page, e.g. https://latest.datasette.io/fixtures/complex_foreign_keys
resource - tuple: (string, string)
The name of the database, then the name of the table
Default allow .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures/complex_foreign_keys"", ""label"": ""https://latest.datasette.io/fixtures/complex_foreign_keys""}]"
authentication:permissions-view-query,authentication,permissions-view-query,view-query,"Actor is allowed to view (and execute) a canned query page, e.g. https://latest.datasette.io/fixtures/pragma_cache_size - this includes executing Writable canned queries .
resource - tuple: (string, string)
The name of the database, then the name of the canned query
Default allow .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures/pragma_cache_size"", ""label"": ""https://latest.datasette.io/fixtures/pragma_cache_size""}]"
authentication:permissions-view-instance,authentication,permissions-view-instance,view-instance,"Top level permission - Actor is allowed to view any pages within this instance, starting at https://latest.datasette.io/
Default allow .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/"", ""label"": ""https://latest.datasette.io/""}]"
authentication:permissions-view-database-download,authentication,permissions-view-database-download,view-database-download,"Actor is allowed to download a database, e.g. https://latest.datasette.io/fixtures.db
resource - string
The name of the database
Default allow .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures.db"", ""label"": ""https://latest.datasette.io/fixtures.db""}]"
authentication:permissions-view-database,authentication,permissions-view-database,view-database,"Actor is allowed to view a database page, e.g. https://latest.datasette.io/fixtures
resource - string
The name of the database
Default allow .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures"", ""label"": ""https://latest.datasette.io/fixtures""}]"
authentication:permissions-plugins,authentication,permissions-plugins,Checking permissions in plugins,"Datasette plugins can check if an actor has permission to perform an action using the datasette.permission_allowed(...) method.
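For example, here is a sketch of such a check from inside an async plugin function - the ""mydb"" and ""mytable"" names are hypothetical:
async def check_table_access(datasette, actor):
    allowed = await datasette.permission_allowed(
        actor,
        ""view-table"",
        resource=(""mydb"", ""mytable""),
        default=True,  # view-table defaults to allow
    )
    if not allowed:
        # Deny access, e.g. by returning a 403 response
        raise PermissionError(""actor may not view this table"")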
Datasette core performs a number of permission checks, documented below . Plugins can implement the permission_allowed(datasette, actor, action, resource) plugin hook to participate in decisions about whether an actor should be able to perform a specified action.","[""Authentication and permissions""]",[]
authentication:permissions-permissions-debug,authentication,permissions-permissions-debug,permissions-debug,"Actor is allowed to view the /-/permissions debug page.
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
authentication:permissions-execute-sql,authentication,permissions-execute-sql,execute-sql,"Actor is allowed to run arbitrary SQL queries against a specific database, e.g. https://latest.datasette.io/fixtures?sql=select+100
resource - string
The name of the database
Default allow . See also the default_allow_sql setting .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures?sql=select+100"", ""label"": ""https://latest.datasette.io/fixtures?sql=select+100""}]"
authentication:permissions-debug-menu,authentication,permissions-debug-menu,debug-menu,"Controls if the various debug pages are displayed in the navigation menu.
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
changelog:permissions,changelog,permissions,Permissions,"Datasette also now has a built-in concept of Permissions . The permissions system answers the following question:
Is this actor allowed to perform this action , optionally against this particular resource ?
You can use the new ""allow"" block syntax in metadata.json (or metadata.yaml ) to set required permissions at the instance, database, table or canned query level. For example, to restrict access to the fixtures.db database to the ""root"" user:
{
""databases"": {
""fixtures"": {
""allow"": {
""id"" ""root""
}
}
}
}
See Defining permissions with ""allow"" blocks for more details.
Plugins can implement their own custom permission checks using the new permission_allowed(datasette, actor, action, resource) hook.
A new debug page at /-/permissions shows recent permission checks, to help administrators and plugin authors understand exactly what checks are being performed. This tool defaults to only being available to the root user, but can be exposed to other users by plugins that respond to the permissions-debug permission. ( #788 )","[""Changelog"", ""0.44 (2020-06-11)""]","[{""href"": ""https://github.com/simonw/datasette/issues/788"", ""label"": ""#788""}]"
performance:performance-inspect,performance,performance-inspect,"Using ""datasette inspect""","Counting the rows in a table can be a very expensive operation on larger databases. In immutable mode Datasette performs this count only once and caches the results, but this can still cause server startup time to increase by several seconds or more.
If you know that a database is never going to change you can precalculate the table row counts once and store them in a JSON file, then use that file when you later start the server.
To create a JSON file containing the calculated row counts for a database, use the following:
datasette inspect data.db --inspect-file=counts.json
Then later you can start Datasette against the counts.json file and use it to skip the row counting step and speed up server startup:
datasette -i data.db --inspect-file=counts.json
You need to open the database file using -i immutable mode here, or the counts from the JSON file will be ignored.
You will rarely need to use this optimization in everyday use, but several of the datasette publish commands described in Publishing data use this optimization for better performance when deploying a database file to a hosting provider.","[""Performance and caching""]",[]
performance:performance-immutable-mode,performance,performance-immutable-mode,Immutable mode,"If you can be certain that a SQLite database file will not be changed by another process you can tell Datasette to open that file in immutable mode .
Doing so will disable all locking and change detection, which can result in improved query performance.
This also enables further optimizations relating to HTTP caching, described below.
To open a file in immutable mode pass it to the datasette command using the -i option:
datasette -i data.db
When you open a file in immutable mode like this Datasette will also calculate and cache the row counts for each table in that database when it first starts up, further improving performance.","[""Performance and caching""]",[]
performance:performance-hashed-urls,performance,performance-hashed-urls,datasette-hashed-urls,"If you open a database file in immutable mode using the -i option, you can be assured that the content of that database will not change for the lifetime of the Datasette server.
The datasette-hashed-urls plugin implements an optimization where your database is served with part of the SHA-256 hash of the database contents baked into the URL.
A database at /fixtures will instead be served at /fixtures-aa7318b , and a year-long cache expiry header will be returned with those pages.
This will then be cached by both browsers and caching proxies such as Cloudflare or Fastly, providing a potentially significant performance boost.
To install the plugin, run the following:
datasette install datasette-hashed-urls
Prior to Datasette 0.61 hashed URL mode was a core Datasette feature, enabled using the hash_urls setting. This implementation has now been removed in favor of the datasette-hashed-urls plugin.
Prior to Datasette 0.28 hashed URL mode was the default behaviour for Datasette, since all database files were assumed to be immutable and unchanging. From 0.28 onwards the default has been to treat database files as mutable unless explicitly configured otherwise.","[""Performance and caching""]","[{""href"": ""https://datasette.io/plugins/datasette-hashed-urls"", ""label"": ""datasette-hashed-urls plugin""}]"
performance:performance,performance,performance,Performance and caching,"Datasette runs on top of SQLite, and SQLite has excellent performance. For small databases almost any query should return in just a few milliseconds, and larger databases (100s of MBs or even GBs of data) should perform extremely well provided your queries make sensible use of database indexes.
That said, there are a number of tricks you can use to improve Datasette's performance.",[],[]
metadata:per-database-and-per-table-metadata,metadata,per-database-and-per-table-metadata,Per-database and per-table metadata,"Metadata at the top level of the JSON will be shown on the index page and in the
footer on every page of the site. The license and source are expected to apply to
all of your data.
You can also provide metadata at the per-database or per-table level, like this:
{
""databases"": {
""database1"": {
""source"": ""Alternative source"",
""source_url"": ""http://example.com/"",
""tables"": {
""example_table"": {
""description_html"": ""Custom table description"",
""license"": ""CC BY 3.0 US"",
""license_url"": ""https://creativecommons.org/licenses/by/3.0/us/""
}
}
}
}
}
Each of the top-level metadata fields can be used at the database and table level.","[""Metadata""]",[]
pages:pages,pages,pages,Pages and API endpoints,"The Datasette web application offers a number of different pages that can be accessed to explore the data in question, each of which is accompanied by an equivalent JSON API.",[],[]
changelog:other-small-fixes,changelog,other-small-fixes,Other small fixes,"Made several performance improvements to the database schema introspection code that runs when Datasette first starts up. ( #1555 )
Label columns detected for foreign keys are now case-insensitive, so Name or TITLE will be detected in the same way as name or title . ( #1544 )
Upgraded Pluggy dependency to 1.0. ( #1575 )
Now using Plausible analytics for the Datasette documentation.
explain query plan is now allowed with varying amounts of whitespace in the query. ( #1588 )
New CLI reference page showing the output of --help for each of the datasette sub-commands. This led to several small improvements to the help copy. ( #1594 )
Fixed bug where writable canned queries could not be used with custom templates. ( #1547 )
Improved fix for a bug where columns with an underscore prefix could result in unnecessary hidden form fields. ( #1527 )","[""Changelog"", ""0.60 (2022-01-13)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1555"", ""label"": ""#1555""}, {""href"": ""https://github.com/simonw/datasette/issues/1544"", ""label"": ""#1544""}, {""href"": ""https://github.com/simonw/datasette/issues/1575"", ""label"": ""#1575""}, {""href"": ""https://plausible.io/"", ""label"": ""Plausible analytics""}, {""href"": ""https://github.com/simonw/datasette/issues/1588"", ""label"": ""#1588""}, {""href"": ""https://github.com/simonw/datasette/issues/1594"", ""label"": ""#1594""}, {""href"": ""https://github.com/simonw/datasette/issues/1547"", ""label"": ""#1547""}, {""href"": ""https://github.com/simonw/datasette/issues/1527"", ""label"": ""#1527""}]"
changelog:other-changes,changelog,other-changes,Other changes,"Datasette can now open multiple database files with the same name, e.g. if you run datasette path/to/one.db path/to/other/one.db . ( #509 )
datasette publish cloudrun now sets force_https_urls for every deployment, fixing some incorrect http:// links. ( #1178 )
Fixed a bug in the example nginx configuration in Running Datasette behind a proxy . ( #1091 )
The Datasette Ecosystem documentation page has been reduced in size in favour of the datasette.io tools and plugins directories. ( #1182 )
The request object now provides a request.full_path property, which returns the path including any query string. ( #1184 )
Better error message for disallowed PRAGMA clauses in SQL queries. ( #1185 )
datasette publish heroku now deploys using python-3.8.7 .
New plugin testing documentation on Testing outbound HTTP calls with pytest-httpx . ( #1198 )
All ?_* query string parameters passed to the table page are now persisted in hidden form fields, so parameters such as ?_size=10 will be correctly passed to the next page when query filters are changed. ( #1194 )
Fixed a bug loading a database file called test-database (1).sqlite . ( #1181 )","[""Changelog"", ""0.54 (2021-01-25)""]","[{""href"": ""https://github.com/simonw/datasette/issues/509"", ""label"": ""#509""}, {""href"": ""https://github.com/simonw/datasette/issues/1178"", ""label"": ""#1178""}, {""href"": ""https://github.com/simonw/datasette/issues/1091"", ""label"": ""#1091""}, {""href"": ""https://datasette.io/tools"", ""label"": ""tools""}, {""href"": ""https://datasette.io/plugins"", ""label"": ""plugins""}, {""href"": ""https://github.com/simonw/datasette/issues/1182"", ""label"": ""#1182""}, {""href"": ""https://github.com/simonw/datasette/issues/1184"", ""label"": ""#1184""}, {""href"": ""https://github.com/simonw/datasette/issues/1185"", ""label"": ""#1185""}, {""href"": ""https://github.com/simonw/datasette/issues/1198"", ""label"": ""#1198""}, {""href"": ""https://github.com/simonw/datasette/issues/1194"", ""label"": ""#1194""}, {""href"": ""https://github.com/simonw/datasette/issues/1181"", ""label"": ""#1181""}]"
plugins:one-off-plugins-using-plugins-dir,plugins,one-off-plugins-using-plugins-dir,One-off plugins using --plugins-dir,"You can also define one-off per-project plugins by saving them as Python files (such as plugin_name.py) in a plugins/ folder and then passing that folder to datasette using the --plugins-dir option:
datasette mydb.db --plugins-dir=plugins/","[""Plugins"", ""Installing plugins""]",[]
deploying:nginx-proxy-configuration,deploying,nginx-proxy-configuration,Nginx proxy configuration,"Here is an example of an nginx configuration file that will proxy traffic to Datasette:
daemon off;
events {
worker_connections 1024;
}
http {
server {
listen 80;
location /my-datasette {
proxy_pass http://127.0.0.1:8009/my-datasette;
proxy_set_header Host $host;
}
}
}
You can also use the --uds option to Datasette to listen on a Unix domain socket instead of a port, configuring the nginx upstream proxy like this:
daemon off;
events {
worker_connections 1024;
}
http {
server {
listen 80;
location /my-datasette {
proxy_pass http://datasette/my-datasette;
proxy_set_header Host $host;
}
}
upstream datasette {
server unix:/tmp/datasette.sock;
}
}
Then run Datasette with datasette --uds /tmp/datasette.sock path/to/database.db --setting base_url /my-datasette/ .","[""Deploying Datasette"", ""Running Datasette behind a proxy""]","[{""href"": ""https://nginx.org/"", ""label"": ""nginx""}]"
changelog:new-visual-design,changelog,new-visual-design,New visual design,"Datasette is no longer white and grey with blue and purple links! Natalie Downe has been working on a visual refresh, the first iteration of which is included in this release. ( #1056 )","[""Changelog"", ""0.51 (2020-10-31)""]","[{""href"": ""https://twitter.com/natbat"", ""label"": ""Natalie Downe""}, {""href"": ""https://github.com/simonw/datasette/pull/1056"", ""label"": ""#1056""}]"
changelog:new-plugin-hooks,changelog,new-plugin-hooks,New plugin hooks,"register_magic_parameters(datasette) can be used to define new types of magic canned query parameters.
startup(datasette) can run custom code when Datasette first starts up. datasette-init is a new plugin that uses this hook to create database tables and views on startup if they have not yet been created. ( #834 )
canned_queries(datasette, database, actor) lets plugins provide additional canned queries beyond those defined in Datasette's metadata. See datasette-saved-queries for an example of this hook in action. ( #852 )
forbidden(datasette, request, message) is a hook for customizing how Datasette responds to 403 forbidden errors. ( #812 )","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette-init"", ""label"": ""datasette-init""}, {""href"": ""https://github.com/simonw/datasette/issues/834"", ""label"": ""#834""}, {""href"": ""https://github.com/simonw/datasette-saved-queries"", ""label"": ""datasette-saved-queries""}, {""href"": ""https://github.com/simonw/datasette/issues/852"", ""label"": ""#852""}, {""href"": ""https://github.com/simonw/datasette/issues/812"", ""label"": ""#812""}]"
changelog:new-plugin-hook-extra-template-vars,changelog,new-plugin-hook-extra-template-vars,New plugin hook: extra_template_vars,"The extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook allows plugins to inject their own additional variables into the Datasette template context. This can be used in conjunction with custom templates to customize the Datasette interface. datasette-auth-github uses this hook to add custom HTML to the new top navigation bar (which is designed to be modified by plugins, see #540 ).","[""Changelog"", ""0.29 (2019-07-07)""]","[{""href"": ""https://github.com/simonw/datasette-auth-github"", ""label"": ""datasette-auth-github""}, {""href"": ""https://github.com/simonw/datasette/issues/540"", ""label"": ""#540""}]"
changelog:new-plugin-hook-asgi-wrapper,changelog,new-plugin-hook-asgi-wrapper,New plugin hook: asgi_wrapper,"The asgi_wrapper(datasette) plugin hook allows plugins to entirely wrap the Datasette ASGI application in their own ASGI middleware. ( #520 )
Two new plugins take advantage of this hook:
datasette-auth-github adds an authentication layer: users will have to sign in using their GitHub account before they can view data or interact with Datasette. You can also use it to restrict access to specific GitHub users, or to members of specified GitHub organizations or teams .
datasette-cors allows you to configure CORS headers for your Datasette instance. You can use this to enable JavaScript running on a whitelisted set of domains to make fetch() calls to the JSON API provided by your Datasette instance.","[""Changelog"", ""0.29 (2019-07-07)""]","[{""href"": ""https://github.com/simonw/datasette/issues/520"", ""label"": ""#520""}, {""href"": ""https://github.com/simonw/datasette-auth-github"", ""label"": ""datasette-auth-github""}, {""href"": ""https://help.github.com/en/articles/about-organizations"", ""label"": ""organizations""}, {""href"": ""https://help.github.com/en/articles/organizing-members-into-teams"", ""label"": ""teams""}, {""href"": ""https://github.com/simonw/datasette-cors"", ""label"": ""datasette-cors""}, {""href"": ""https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS"", ""label"": ""CORS headers""}]"
changelog:new-features,changelog,new-features,New features,"If an error occurs while executing a user-provided SQL query, that query is now re-displayed in an editable form along with the error message. ( #619 )
New ?_col= and ?_nocol= parameters to show and hide columns in a table, plus an interface for hiding and showing columns in the column cog menu. ( #615 )
A new ?_facet_size= parameter for customizing the number of facet results returned on a table or view page. ( #1332 )
?_facet_size=max sets that to the maximum, which defaults to 1,000 and is controlled by the max_returned_rows setting. If facet results are truncated the … at the bottom of the facet list now links to this parameter. ( #1337 )
?_nofacet=1 option to disable all facet calculations on a page, used as a performance optimization for CSV exports and ?_shape=array/object . ( #1349 , #263 )
?_nocount=1 option to disable full query result counts. ( #1353 )
?_trace=1 debugging option is now controlled by the new trace_debug setting, which is turned off by default. ( #1359 )","[""Changelog"", ""0.57 (2021-06-05)""]","[{""href"": ""https://github.com/simonw/datasette/issues/619"", ""label"": ""#619""}, {""href"": ""https://github.com/simonw/datasette/issues/615"", ""label"": ""#615""}, {""href"": ""https://github.com/simonw/datasette/issues/1332"", ""label"": ""#1332""}, {""href"": ""https://github.com/simonw/datasette/issues/1337"", ""label"": ""#1337""}, {""href"": ""https://github.com/simonw/datasette/issues/1349"", ""label"": ""#1349""}, {""href"": ""https://github.com/simonw/datasette/issues/263"", ""label"": ""#263""}, {""href"": ""https://github.com/simonw/datasette/issues/1353"", ""label"": ""#1353""}, {""href"": ""https://github.com/simonw/datasette/issues/1359"", ""label"": ""#1359""}]"
changelog:new-configuration-settings,changelog,new-configuration-settings,New configuration settings,"Datasette's Settings now also supports boolean settings. A number of new
configuration options have been added:
num_sql_threads - the number of threads used to execute SQLite queries. Defaults to 3.
allow_facet - enable or disable custom Facets using the _facet= parameter. Defaults to on.
suggest_facets - should Datasette suggest facets? Defaults to on.
allow_download - should users be allowed to download the entire SQLite database? Defaults to on.
allow_sql - should users be allowed to execute custom SQL queries? Defaults to on.
default_cache_ttl - Default HTTP caching max-age header in seconds. Defaults to 365 days - caching can be disabled entirely by setting this to 0.
cache_size_kb - Set the amount of memory SQLite uses for its per-connection cache , in KB.
allow_csv_stream - allow users to stream entire result sets as a single CSV file. Defaults to on.
max_csv_mb - maximum size of a returned CSV file in MB. Defaults to 100MB, set to 0 to disable this limit.","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://www.sqlite.org/pragma.html#pragma_cache_size"", ""label"": ""per-connection cache""}]"
changelog:named-in-memory-database-support,changelog,named-in-memory-database-support,Named in-memory database support,"As part of the work building the _internal database, Datasette now supports named in-memory databases that can be shared across multiple connections. This allows plugins to create in-memory databases which will persist data for the lifetime of the Datasette server process. ( #1151 )
The new memory_name= parameter to the Database class can be used to create named, shared in-memory databases.","[""Changelog"", ""0.54 (2021-01-25)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1151"", ""label"": ""#1151""}]"
changelog:miscellaneous,changelog,miscellaneous,Miscellaneous,"Got JSON data in one of your columns? Use the new ?_json=COLNAME argument
to tell Datasette to return that JSON value directly rather than encoding it
as a string.
If you just want an array of the first value of each row, use the new
?_shape=arrayfirst option - example .","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://latest.datasette.io/fixtures.json?sql=select+neighborhood+from+facetable+order+by+pk+limit+101&_shape=arrayfirst"", ""label"": ""example""}]"
metadata:metadata-yaml,metadata,metadata-yaml,Using YAML for metadata,"Datasette accepts YAML as an alternative to JSON for your metadata configuration file. YAML is particularly useful for including multiline HTML and SQL strings.
Here's an example of a metadata.yml file, re-using an example from Canned queries .
title: Demonstrating Metadata from YAML
description_html: |-
This description includes a long HTML string
YAML is better for embedding HTML strings than JSON!
license: ODbL
license_url: https://opendatacommons.org/licenses/odbl/
databases:
fixtures:
tables:
no_primary_key:
hidden: true
queries:
neighborhood_search:
sql: |-
select neighborhood, facet_cities.name, state
from facetable join facet_cities on facetable.city_id = facet_cities.id
where neighborhood like '%' || :text || '%' order by neighborhood;
title: Search neighborhoods
description_html: |-
This demonstrates basic LIKE search
The metadata.yml file is passed to Datasette using the same --metadata option:
datasette fixtures.db --metadata metadata.yml","[""Metadata""]",[]
metadata:metadata-source-license-about,metadata,metadata-source-license-about,"Source, license and about","The three visible metadata fields you can apply to everything, specific databases or specific tables are source, license and about. All three are optional.
source and source_url should be used to indicate where the underlying data came from.
license and license_url should be used to indicate the license under which the data can be used.
about and about_url can be used to link to further information about the project - an accompanying blog entry for example.
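For example, applying all three to everything at the top level of metadata.json (the values here are placeholders):
{
    ""source"": ""Example Data Source"",
    ""source_url"": ""http://example.com/data"",
    ""license"": ""ODbL"",
    ""license_url"": ""https://opendatacommons.org/licenses/odbl/"",
    ""about"": ""Example project blog post"",
    ""about_url"": ""http://example.com/about""
}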
For each of these you can provide just the *_url field and Datasette will treat that as the default link label text and display the URL directly on the page.","[""Metadata""]",[]
metadata:metadata-sortable-columns,metadata,metadata-sortable-columns,Setting which columns can be used for sorting,"Datasette allows any column to be used for sorting by default. If you need to
control which columns are available for sorting you can do so using the optional
sortable_columns key:
{
""databases"": {
""database1"": {
""tables"": {
""example_table"": {
""sortable_columns"": [
""height"",
""weight""
]
}
}
}
}
}
This will restrict sorting of example_table to just the height and
weight columns.
You can also disable sorting entirely by setting ""sortable_columns"": []
You can use sortable_columns to enable specific sort orders for a view called name_of_view in the database my_database like so:
{
""databases"": {
""my_database"": {
""tables"": {
""name_of_view"": {
""sortable_columns"": [
""clicks"",
""impressions""
]
}
}
}
}
}","[""Metadata""]",[]
metadata:metadata-page-size,metadata,metadata-page-size,Setting a custom page size,"Datasette defaults to displaying 100 rows per page, for both tables and views. You can change this default page size on a per-table or per-view basis using the ""size"" key in metadata.json :
{
""databases"": {
""mydatabase"": {
""tables"": {
""example_table"": {
""size"": 10
}
}
}
}
}
This size can still be overridden by passing e.g. ?_size=50 in the query string.","[""Metadata""]",[]
metadata:metadata-hiding-tables,metadata,metadata-hiding-tables,Hiding tables,"You can hide tables from the database listing view (in the same way that FTS and
SpatiaLite tables are automatically hidden) using ""hidden"": true :
{
""databases"": {
""database1"": {
""tables"": {
""example_table"": {
""hidden"": true
}
}
}
}
}","[""Metadata""]",[]
metadata:metadata-default-sort,metadata,metadata-default-sort,Setting a default sort order,"By default Datasette tables are sorted by primary key. You can override this default for a specific table using the ""sort"" or ""sort_desc"" metadata properties:
{
""databases"": {
""mydatabase"": {
""tables"": {
""example_table"": {
""sort"": ""created""
}
}
}
}
}
Or use ""sort_desc"" to sort in descending order:
{
""databases"": {
""mydatabase"": {
""tables"": {
""example_table"": {
""sort_desc"": ""created""
}
}
}
}
}","[""Metadata""]",[]
metadata:metadata-column-descriptions,metadata,metadata-column-descriptions,Column descriptions,"You can include descriptions for your columns by adding a ""columns"": {""name-of-column"": ""description-of-column""} block to your table metadata:
{
""databases"": {
""database1"": {
""tables"": {
""example_table"": {
""columns"": {
""column1"": ""Description of column 1"",
""column2"": ""Description of column 2""
}
}
}
}
}
}
These will be displayed at the top of the table page, and will also show in the cog menu for each column.
You can see an example of how these look at latest.datasette.io/fixtures/roadside_attractions .","[""Metadata""]","[{""href"": ""https://latest.datasette.io/fixtures/roadside_attractions"", ""label"": ""latest.datasette.io/fixtures/roadside_attractions""}]"
introspection:messagesdebugview,introspection,messagesdebugview,/-/messages,"The debug tool at /-/messages can be used to set flash messages to try out that feature. See .add_message(request, message, type=datasette.INFO) for details of this feature.","[""Introspection""]",[]
spatialite:making-use-of-a-spatial-index,spatialite,making-use-of-a-spatial-index,Making use of a spatial index,"SpatiaLite spatial indexes are R*Trees. They allow you to run efficient bounding box queries using a sub-select, with a similar pattern to that used for Searches using custom SQL .
In the above example, the resulting index will be called idx_museums_point_geom . This takes the form of a SQLite virtual table. You can inspect its contents using the following query:
select * from idx_museums_point_geom limit 10;
Here's a live example: timezones-api.datasette.io/timezones/idx_timezones_Geometry
pkid  xmin                 xmax                 ymin                 ymax
1     -8.601725578308105   -2.4930307865142822  4.162120819091797    10.74019718170166
2     -3.2607860565185547  1.27329421043396     4.539252281188965    11.174856185913086
3     32.997581481933594   47.98238754272461    3.3974475860595703   14.894054412841797
4     -8.66890811920166    11.997337341308594   18.9681453704834     37.296207427978516
5     36.43336486816406    43.300174713134766   12.354820251464844   18.070993423461914
You can now construct efficient bounding box queries that will make use of the index like this:
select * from museums where museums.rowid in (
SELECT pkid FROM idx_museums_point_geom
-- left-hand-edge of point > left-hand-edge of bbox (minx)
where xmin > :bbox_minx
-- right-hand-edge of point < right-hand-edge of bbox (maxx)
and xmax < :bbox_maxx
-- bottom-edge of point > bottom-edge of bbox (miny)
and ymin > :bbox_miny
-- top-edge of point < top-edge of bbox (maxy)
and ymax < :bbox_maxy
);
Spatial indexes can be created against polygon columns as well as point columns, in which case they will represent the minimum bounding rectangle of that polygon. This is useful for accelerating within queries, as seen in the Timezones API example.","[""SpatiaLite""]","[{""href"": ""https://timezones-api.datasette.io/timezones/idx_timezones_Geometry"", ""label"": ""timezones-api.datasette.io/timezones/idx_timezones_Geometry""}]"
changelog:magic-parameters-for-canned-queries,changelog,magic-parameters-for-canned-queries,Magic parameters for canned queries,"Canned queries now support Magic parameters , which can be used to insert or select automatically generated values. For example:
insert into logs
(user_id, timestamp)
values
(:_actor_id, :_now_datetime_utc)
This inserts the currently authenticated actor ID and the current datetime. ( #842 )","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette/issues/842"", ""label"": ""#842""}]"
authentication:logoutview,authentication,logoutview,The /-/logout page,The page at /-/logout provides the ability to log out of a ds_actor cookie authentication session.,"[""Authentication and permissions"", ""The ds_actor cookie""]",[]
changelog:log-out,changelog,log-out,Log out,"The ds_actor cookie can be used by plugins (or by Datasette's --root mechanism ) to authenticate users. The new /-/logout page provides a way to clear that cookie.
A ""Log out"" button now shows in the global navigation provided the user is authenticated using the ds_actor cookie. ( #840 )","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette/issues/840"", ""label"": ""#840""}]"
installation:loading-spatialite,installation,loading-spatialite,Loading SpatiaLite,"The datasetteproject/datasette image includes a recent version of the
SpatiaLite extension for SQLite. To load and enable that
module, use the following command:
docker run -p 8001:8001 -v `pwd`:/mnt \
datasetteproject/datasette \
datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \
--load-extension=spatialite
You can confirm that SpatiaLite is successfully loaded by visiting
http://127.0.0.1:8001/-/versions","[""Installation"", ""Advanced installation options"", ""Using Docker""]","[{""href"": ""http://127.0.0.1:8001/-/versions"", ""label"": ""http://127.0.0.1:8001/-/versions""}]"
changelog:latest-datasette-io,changelog,latest-datasette-io,latest.datasette.io,"Every commit to Datasette master is now automatically deployed by Travis CI to
https://latest.datasette.io/ - ensuring there is always a live demo of the
latest version of the software.
The demo uses the fixtures from our
unit tests, ensuring it demonstrates the same range of functionality that is
covered by the tests.
You can see how the deployment mechanism works in our .travis.yml file.","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://latest.datasette.io/"", ""label"": ""https://latest.datasette.io/""}, {""href"": ""https://github.com/simonw/datasette/blob/master/tests/fixtures.py"", ""label"": ""the fixtures""}, {""href"": ""https://github.com/simonw/datasette/blob/master/.travis.yml"", ""label"": "".travis.yml""}]"
metadata:label-columns,metadata,label-columns,Specifying the label column for a table,"Datasette's HTML interface attempts to display foreign key references as
labelled hyperlinks. By default, it looks for referenced tables that only have
two columns: a primary key column and one other. It assumes that the second
column should be used as the link label.
If your table has more than two columns you can specify which column should be
used for the link label with the label_column property:
{
""databases"": {
""database1"": {
""tables"": {
""example_table"": {
""label_column"": ""title""
}
}
}
}
}","[""Metadata""]",[]
introspection:jsondataview-versions,introspection,jsondataview-versions,/-/versions,"Shows the version of Datasette, Python and SQLite. Versions example :
{
""datasette"": {
""version"": ""0.60""
},
""python"": {
""full"": ""3.8.12 (default, Dec 21 2021, 10:45:09) \n[GCC 10.2.1 20210110]"",
""version"": ""3.8.12""
},
""sqlite"": {
""extensions"": {
""json1"": null
},
""fts_versions"": [
""FTS5"",
""FTS4"",
""FTS3""
],
""compile_options"": [
""COMPILER=gcc-6.3.0 20170516"",
""ENABLE_FTS3"",
""ENABLE_FTS4"",
""ENABLE_FTS5"",
""ENABLE_JSON1"",
""ENABLE_RTREE"",
""THREADSAFE=1""
],
""version"": ""3.37.0""
}
}","[""Introspection""]","[{""href"": ""https://latest.datasette.io/-/versions"", ""label"": ""Versions example""}]"
introspection:jsondataview-threads,introspection,jsondataview-threads,/-/threads,"Shows details of threads and asyncio tasks. Threads example :
{
""num_threads"": 2,
""threads"": [
{
""daemon"": false,
""ident"": 4759197120,
""name"": ""MainThread""
},
{
""daemon"": true,
""ident"": 123145319682048,
""name"": ""Thread-1""
}
],
""num_tasks"": 3,
""tasks"": [
"" cb=[set.discard()]>"",
"" wait_for=()]> cb=[run_until_complete..()]>"",
"" wait_for=()]>>""
]
}","[""Introspection""]","[{""href"": ""https://latest.datasette.io/-/threads"", ""label"": ""Threads example""}]"
introspection:jsondataview-plugins,introspection,jsondataview-plugins,/-/plugins,"Shows a list of currently installed plugins and their versions. Plugins example :
[
{
""name"": ""datasette_cluster_map"",
""static"": true,
""templates"": false,
""version"": ""0.10"",
""hooks"": [""extra_css_urls"", ""extra_js_urls"", ""extra_body_script""]
}
]
Add ?all=1 to include details of the default plugins baked into Datasette.","[""Introspection""]","[{""href"": ""https://san-francisco.datasettes.com/-/plugins"", ""label"": ""Plugins example""}]"
introspection:jsondataview-metadata,introspection,jsondataview-metadata,/-/metadata,"Shows the contents of the metadata.json file that was passed to datasette serve , if any. Metadata example :
{
""license"": ""CC Attribution 4.0 License"",
""license_url"": ""http://creativecommons.org/licenses/by/4.0/"",
""source"": ""fivethirtyeight/data on GitHub"",
""source_url"": ""https://github.com/fivethirtyeight/data"",
""title"": ""Five Thirty Eight"",
""databases"": {
}
}","[""Introspection""]","[{""href"": ""https://fivethirtyeight.datasettes.com/-/metadata"", ""label"": ""Metadata example""}]"
introspection:jsondataview-databases,introspection,jsondataview-databases,/-/databases,"Shows currently attached databases. Databases example :
[
{
""hash"": null,
""is_memory"": false,
""is_mutable"": true,
""name"": ""fixtures"",
""path"": ""fixtures.db"",
""size"": 225280
}
]","[""Introspection""]","[{""href"": ""https://latest.datasette.io/-/databases"", ""label"": ""Databases example""}]"
introspection:jsondataview-config,introspection,jsondataview-config,/-/settings,"Shows the Settings for this instance of Datasette. Settings example :
{
""default_facet_size"": 30,
""default_page_size"": 100,
""facet_suggest_time_limit_ms"": 50,
""facet_time_limit_ms"": 1000,
""max_returned_rows"": 1000,
""sql_time_limit_ms"": 1000
}","[""Introspection""]","[{""href"": ""https://fivethirtyeight.datasettes.com/-/settings"", ""label"": ""Settings example""}]"
introspection:jsondataview-actor,introspection,jsondataview-actor,/-/actor,"Shows the currently authenticated actor. Useful for debugging Datasette authentication plugins.
{
""actor"": {
""id"": 1,
""username"": ""some-user""
}
}","[""Introspection""]",[]
json_api:json-api-table-arguments,json_api,json-api-table-arguments,Special table arguments,"?_col=COLUMN1&_col=COLUMN2
List specific columns to display. These will be shown along with any primary keys.
?_nocol=COLUMN1&_nocol=COLUMN2
List specific columns to hide - any column not listed will be displayed. Primary keys cannot be hidden.
?_labels=on/off
Expand foreign key references for every possible column. See below.
?_label=COLUMN1&_label=COLUMN2
Expand foreign key references for one or more specified columns.
?_size=1000 or ?_size=max
Sets a custom page size. This cannot exceed the max_returned_rows limit
passed to datasette serve . Use max to get max_returned_rows .
?_sort=COLUMN
Sorts the results by the specified column.
?_sort_desc=COLUMN
Sorts the results by the specified column in descending order.
?_search=keywords
For SQLite tables that have been configured for
full-text search executes a search
with the provided keywords.
?_search_COLUMN=keywords
Like _search= but allows you to specify the column to be searched, as
opposed to searching all columns that have been indexed by FTS.
?_searchmode=raw
With this option, queries passed to ?_search= or ?_search_COLUMN= will
not have special characters escaped. This means you can make use of the full
set of advanced SQLite FTS syntax ,
though this could potentially result in errors if the wrong syntax is used.
?_where=SQL-fragment
If the execute-sql permission is enabled, this parameter
can be used to pass one or more additional SQL fragments to be used in the
WHERE clause of the SQL used to query the table.
This is particularly useful if you are building a JavaScript application
that needs to do something creative but still wants the other conveniences
provided by the table view (such as faceting) and hence would like not to
have to construct a completely custom SQL query.
Some examples:
facetable?_where=_neighborhood like ""%c%""&_where=_city_id=3
facetable?_where=_city_id in (select id from facet_cities where name != ""Detroit"")
?_through={json}
This can be used to filter rows via a join against another table.
The JSON parameter must include three keys: table , column and value .
table must be a table that the current table is related to via a foreign key relationship.
column must be a column in that other table.
value is the value that you want to match against.
For example, to filter roadside_attractions to just show the attractions that have a characteristic of ""museum"", you would construct this JSON:
{
""table"": ""roadside_attraction_characteristics"",
""column"": ""characteristic_id"",
""value"": ""1""
}
As a URL, that looks like this:
?_through={%22table%22:%22roadside_attraction_characteristics%22,%22column%22:%22characteristic_id%22,%22value%22:%221%22}
Here's an example .
?_next=TOKEN
Pagination by continuation token - pass the token that was returned in the
""next"" property by the previous page.
?_facet=column
Facet by column. Can be applied multiple times, see Facets . Only works on the default JSON output, not on any of the custom shapes.
?_facet_size=100
Increase the number of facet results returned for each facet. Use ?_facet_size=max for the maximum available size, determined by max_returned_rows .
?_nofacet=1
Disable all facets and facet suggestions for this page, including any defined by Facets in metadata.json .
?_nosuggest=1
Disable facet suggestions for this page.
?_nocount=1
Disable the select count(*) query used on this page - a count of None will be returned instead.
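Parameters like ?_through= that embed JSON can be fiddly to construct by hand. As a minimal sketch - using only the Python standard library, and the fixtures demo linked above - here is one way to build such a URL:
import json
from urllib.parse import urlencode

base = ""https://latest.datasette.io/fixtures/roadside_attractions.json""
through = {
    ""table"": ""roadside_attraction_characteristics"",
    ""column"": ""characteristic_id"",
    ""value"": ""1"",
}
# urlencode() percent-encodes the JSON document for us
url = base + ""?"" + urlencode({""_through"": json.dumps(through), ""_size"": ""max""})
print(url)","[""JSON API"", ""Table arguments""]","[{""href"": ""https://www.sqlite.org/fts3.html"", ""label"": ""full-text search""}, {""href"": ""https://www.sqlite.org/fts5.html#full_text_query_syntax"", ""label"": ""advanced SQLite FTS syntax""}, {""href"": ""https://latest.datasette.io/fixtures/facetable?_where=_neighborhood%20like%20%22%c%%22&_where=_city_id=3"", ""label"": ""facetable?_where=_neighborhood like \""%c%\""&_where=_city_id=3""}, {""href"": ""https://latest.datasette.io/fixtures/facetable?_where=_city_id%20in%20(select%20id%20from%20facet_cities%20where%20name%20!=%20%22Detroit%22)"", ""label"": ""facetable?_where=_city_id in (select id from facet_cities where name != \""Detroit\"")""}, {""href"": ""https://latest.datasette.io/fixtures/roadside_attractions?_through={%22table%22:%22roadside_attraction_characteristics%22,%22column%22:%22characteristic_id%22,%22value%22:%221%22}"", ""label"": ""an example""}]"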
json_api:json-api-special,json_api,json-api-special,Special JSON arguments,"Every Datasette endpoint that can return JSON also accepts the following
query string arguments:
?_shape=SHAPE
The shape of the JSON to return, documented above.
?_nl=on
When used with ?_shape=array produces newline-delimited JSON objects.
?_json=COLUMN1&_json=COLUMN2
If any of your SQLite columns contain JSON values, you can use one or more
_json= parameters to request that those columns be returned as regular
JSON. Without this argument those columns will be returned as JSON objects
that have been double-encoded into a JSON string value.
Compare this query without the argument to this query using the argument
?_json_infinity=on
If your data contains infinity or -infinity values, Datasette will replace
them with None when returning them as JSON. If you pass _json_infinity=1
Datasette will instead return them as Infinity or -Infinity which is
invalid JSON but can be processed by some custom JSON parsers.
?_timelimit=MS
Sets a custom time limit for the query in ms. You can use this for optimistic
queries where you would like Datasette to give up if the query takes too
long, for example if you want to implement autocomplete search but only if
it can be executed in less than 10ms.
?_ttl=SECONDS
For how many seconds should this response be cached by HTTP proxies? Use
?_ttl=0 to disable HTTP caching entirely for this request.
?_trace=1
Turns on tracing for this page: SQL queries executed during the request will
be gathered and included in the response, either in a new ""_traces"" key
for JSON responses or at the bottom of the page if the response is in HTML.
The structure of the data returned here should be considered highly unstable
and very likely to change.
Only available if the trace_debug setting is enabled.
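The ?_shape=array&_nl=on combination works well for streaming large result sets. A minimal sketch using the requests library, with the fixtures demo standing in for your own instance:
import json
import requests

response = requests.get(
    ""https://latest.datasette.io/fixtures/facetable.json"",
    params={""_shape"": ""array"", ""_nl"": ""on""},
    stream=True,
)
# Each line of the response body is a complete JSON object for one row
for line in response.iter_lines():
    if line:
        print(json.loads(line))","[""JSON API""]","[{""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight.json?sql=select+%27{%22this+is%22%3A+%22a+json+object%22}%27+as+d&_shape=array"", ""label"": ""this query without the argument""}, {""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight.json?sql=select+%27{%22this+is%22%3A+%22a+json+object%22}%27+as+d&_shape=array&_json=d"", ""label"": ""this query using the argument""}]"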
json_api:json-api-shapes,json_api,json-api-shapes,Different shapes,"The default JSON representation of data from a SQLite table or custom query
looks like this:
{
""database"": ""sf-trees"",
""table"": ""qSpecies"",
""columns"": [
""id"",
""value""
],
""rows"": [
[
1,
""Myoporum laetum :: Myoporum""
],
[
2,
""Metrosideros excelsa :: New Zealand Xmas Tree""
],
[
3,
""Pinus radiata :: Monterey Pine""
]
],
""truncated"": false,
""next"": ""100"",
""next_url"": ""http://127.0.0.1:8001/sf-trees-02c8ef1/qSpecies.json?_next=100"",
""query_ms"": 1.9571781158447266
}
The columns key lists the columns that are being returned, and the rows
key then returns a list of lists, each one representing a row. The order of the
values in each row corresponds to the columns.
The _shape parameter can be used to access alternative formats for the
rows key which may be more convenient for your application. The following options are supported:
?_shape=arrays - ""rows"" is the default option, shown above
?_shape=objects - ""rows"" is a list of JSON key/value objects
?_shape=array - a JSON array of objects
?_shape=array&_nl=on - a newline-separated list of JSON objects
?_shape=arrayfirst - a flat JSON array containing just the first value from each row
?_shape=object - a JSON object keyed using the primary keys of the rows
_shape=objects looks like this:
{
""database"": ""sf-trees"",
...
""rows"": [
{
""id"": 1,
""value"": ""Myoporum laetum :: Myoporum""
},
{
""id"": 2,
""value"": ""Metrosideros excelsa :: New Zealand Xmas Tree""
},
{
""id"": 3,
""value"": ""Pinus radiata :: Monterey Pine""
}
]
}
_shape=array looks like this:
[
{
""id"": 1,
""value"": ""Myoporum laetum :: Myoporum""
},
{
""id"": 2,
""value"": ""Metrosideros excelsa :: New Zealand Xmas Tree""
},
{
""id"": 3,
""value"": ""Pinus radiata :: Monterey Pine""
}
]
_shape=array&_nl=on looks like this:
{""id"": 1, ""value"": ""Myoporum laetum :: Myoporum""}
{""id"": 2, ""value"": ""Metrosideros excelsa :: New Zealand Xmas Tree""}
{""id"": 3, ""value"": ""Pinus radiata :: Monterey Pine""}
_shape=arrayfirst looks like this:
[1, 2, 3]
_shape=object looks like this:
{
""1"": {
""id"": 1,
""value"": ""Myoporum laetum :: Myoporum""
},
""2"": {
""id"": 2,
""value"": ""Metrosideros excelsa :: New Zealand Xmas Tree""
},
""3"": {
""id"": 3,
""value"": ""Pinus radiata :: Monterey Pine""
}
}
The object shape is only available for queries against tables - custom SQL
queries and views do not have an obvious primary key so cannot be returned using
this format.
The object keys are always strings. If your table has a compound primary
key, the object keys will be a comma-separated string.
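Which shape to ask for depends on how you plan to consume the data. As a minimal sketch, assuming the requests library and the fixtures demo, ?_shape=array is often the most convenient because the response is just a list of dictionaries:
import requests

rows = requests.get(
    ""https://latest.datasette.io/fixtures/facetable.json"",
    params={""_shape"": ""array""},
).json()
for row in rows:
    print(row[""pk""])","[""JSON API""]",[]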
json_api:json-api-pagination,json_api,json-api-pagination,Pagination,"The default JSON representation includes a ""next_url"" key which can be used to access the next page of results. If that key is null or missing then it means you have reached the final page of results.
Other representations include pagination information in the link HTTP header. That header will look something like this:
link: <https://latest.datasette.io/fixtures/sortable.json?_next=TOKEN>; rel=""next""
Here is an example Python function built using requests that returns a list of all of the paginated items from one of these API endpoints:
import requests

def paginate(url):
    items = []
    while url:
        response = requests.get(url)
        # Assumes a shape that returns a JSON list, such as ?_shape=array
        items.extend(response.json())
        # requests parses the link HTTP header into response.links
        url = response.links.get(""next"", {}).get(""url"")
    return items","[""JSON API""]","[{""href"": ""https://requests.readthedocs.io/"", ""label"": ""requests""}]"
json_api:json-api-discover-alternate,json_api,json-api-discover-alternate,Discovering the JSON for a page,"Most of the HTML pages served by Datasette provide a mechanism for discovering their JSON equivalents using the HTML link mechanism.
You can find this near the top of the source code of those pages, looking like this:
<link rel=""alternate""
  type=""application/json+datasette""
  href=""https://latest.datasette.io/fixtures/sortable.json"">
The JSON URL is also made available in a Link HTTP header for the page:
Link: https://latest.datasette.io/fixtures/sortable.json; rel=""alternate""; type=""application/json+datasette""
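A sketch of reading that header from Python, assuming the requests library (which parses Link headers into response.links , keyed by rel ):
import requests

response = requests.get(""https://latest.datasette.io/fixtures/sortable"")
alternate = response.links.get(""alternate"", {})
print(alternate.get(""url""))","[""JSON API""]",[]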
changelog:javascript-modules,changelog,javascript-modules,JavaScript modules,"JavaScript modules were introduced in ECMAScript 2015 and provide native browser support for the import and export keywords.
To use modules, JavaScript needs to be included in <script> tags with a type=""module"" attribute.
You can also specify a SRI (subresource integrity hash) for these assets:
{
""extra_css_urls"": [
{
""url"": ""https://simonwillison.net/static/css/all.bf8cd891642c.css"",
""sri"": ""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI""
}
],
""extra_js_urls"": [
{
""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"",
""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=""
}
]
}
This will produce:
<link rel=""stylesheet"" href=""https://simonwillison.net/static/css/all.bf8cd891642c.css""
  integrity=""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI""
  crossorigin=""anonymous"">
<script src=""https://code.jquery.com/jquery-3.2.1.slim.min.js""
  integrity=""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=""
  crossorigin=""anonymous""></script>
Modern browsers will only execute the stylesheet or JavaScript if the SRI hash
matches the content served. You can generate hashes using www.srihash.org
Items in ""extra_js_urls"" can specify ""module"": true if they reference JavaScript that uses JavaScript modules . This configuration:
{
""extra_js_urls"": [
{
""url"": ""https://example.datasette.io/module.js"",
""module"": true
}
]
}
Will produce this HTML:
","[""Custom pages and templates""]","[{""href"": ""https://www.srihash.org/"", ""label"": ""www.srihash.org""}, {""href"": ""https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules"", ""label"": ""JavaScript modules""}]"
custom_templates:customization,custom_templates,customization,Custom pages and templates,Datasette provides a number of ways of customizing the way data is displayed.,[],[]
custom_templates:custom-pages-redirects,custom_templates,custom-pages-redirects,Custom redirects,"You can use the custom_redirect(location) function to redirect users to another page, for example in a file called pages/datasette.html :
{{ custom_redirect(""https://github.com/simonw/datasette"") }}
Now requests to http://localhost:8001/datasette will result in a redirect.
These redirects are served with a 302 Found status code by default. You can send a 301 Moved Permanently code by passing 301 as the second argument to the function:
{{ custom_redirect(""https://github.com/simonw/datasette"", 301) }}","[""Custom pages and templates"", ""Custom pages""]",[]
custom_templates:custom-pages-parameters,custom_templates,custom-pages-parameters,Path parameters for pages,"You can define custom pages that match multiple paths by creating files with {variable} definitions in their filenames.
For example, to capture any request to a URL matching /about/* , you would create a template in the following location:
templates/pages/about/{slug}.html
A hit to /about/news would render that template and pass in a variable called slug with a value of ""news"" .
If you use this mechanism don't forget to return a 404 if the referenced content could not be found. You can do this using {{ raise_404() }} described below.
Templates defined using custom page routes work particularly well with the sql() template function from datasette-template-sql or the graphql() template function from datasette-graphql .","[""Custom pages and templates"", ""Custom pages""]","[{""href"": ""https://github.com/simonw/datasette-template-sql"", ""label"": ""datasette-template-sql""}, {""href"": ""https://github.com/simonw/datasette-graphql#the-graphql-template-function"", ""label"": ""datasette-graphql""}]"
custom_templates:custom-pages-headers,custom_templates,custom-pages-headers,Custom headers and status codes,"Custom pages default to being served with a content-type of text/html; charset=utf-8 and a 200 status code. You can change these by calling a custom function from within your template.
For example, to serve a custom page with a 418 I'm a teapot HTTP status code, create a file in pages/teapot.html containing the following:
{{ custom_status(418) }}
<html>
<head><title>Teapot</title></head>
<body>
I'm a teapot
</body>
</html>
To serve a custom HTTP header, add a custom_header(name, value) function call. For example:
{{ custom_status(418) }}
{{ custom_header(""x-teapot"", ""I am"") }}
<html>
<head><title>Teapot</title></head>
<body>
I'm a teapot
</body>
</html>
You can verify this is working using curl like this:
$ curl -I 'http://127.0.0.1:8001/teapot'
HTTP/1.1 418
date: Sun, 26 Apr 2020 18:38:30 GMT
server: uvicorn
x-teapot: I am
content-type: text/html; charset=utf-8","[""Custom pages and templates"", ""Custom pages""]",[]
custom_templates:custom-pages-errors,custom_templates,custom-pages-errors,Custom error pages,"Datasette returns an error page if an unexpected error occurs, access is forbidden or content cannot be found.
You can customize the response returned for these errors by providing a custom error page template.
Content not found errors use a 404.html template. Access denied errors use 403.html . Invalid input errors use 400.html . Unexpected errors of other kinds use 500.html .
If a template for the specific error code is not found a template called error.html will be used instead. If you do not provide that template Datasette's default error.html template will be used.
The error template will be passed the following context:
status - integer
The integer HTTP status code, e.g. 404, 500, 403, 400.
error - string
Details of the specific error, usually a full sentence.
title - string or None
A title for the page representing the class of error. This is often None for errors that do not provide a title separate from their error message.","[""Custom pages and templates""]","[{""href"": ""https://github.com/simonw/datasette/blob/main/datasette/templates/error.html"", ""label"": ""default error.html template""}]"
custom_templates:custom-pages-404,custom_templates,custom-pages-404,Returning 404s,"To indicate that content could not be found and display the default 404 page you can use the raise_404(message) function:
{% if not rows %}
{{ raise_404(""Content not found"") }}
{% endif %}
If you call raise_404() the other content in your template will be ignored.","[""Custom pages and templates"", ""Custom pages""]",[]
csv_export:csv-export-url-parameters,csv_export,csv-export-url-parameters,URL parameters,"The following options can be used to customize the CSVs returned by Datasette.
?_header=off
This removes the first row of the CSV file specifying the headings - only the row data will be returned.
?_stream=on
Stream all matching records, not just the first page of results. See below.
?_dl=on
Causes Datasette to return a content-disposition: attachment; filename=""filename.csv"" header.
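Combining these options, a minimal Python sketch using the requests library - with the fixtures demo standing in for your own instance - downloads every matching row as a CSV file:
import requests

response = requests.get(
    ""https://latest.datasette.io/fixtures/facetable.csv"",
    params={""_stream"": ""on""},
    stream=True,
)
with open(""facetable.csv"", ""wb"") as f:
    for chunk in response.iter_content(chunk_size=8192):
        f.write(chunk)","[""CSV export""]",[]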
changelog:csv-export,changelog,csv-export,CSV export,"Any Datasette table, view or custom SQL query can now be exported as CSV.
Check out the CSV export documentation for more details, or
try the feature out on
https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies
If your table has more than max_returned_rows (default 1,000)
Datasette provides the option to stream all rows . This option takes advantage
of async Python and Datasette's efficient pagination to
iterate through the entire matching result set and stream it back as a
downloadable CSV file.","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies"", ""label"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies""}]"
custom_templates:css-classes-on-the-body,custom_templates,css-classes-on-the-body,CSS classes on the <body>,"Every default template includes CSS classes in the body designed to support
custom styling.
The index template (the top level page at / ) gets this:
<body class=""index"">
The database template ( /dbname ) gets this:
<body class=""db db-dbname"">
The custom SQL template ( /dbname?sql=... ) gets this:
<body class=""query db-dbname"">
A canned query template ( /dbname/queryname ) gets this:
<body class=""query db-dbname query-queryname"">
The table template ( /dbname/tablename ) gets:
<body class=""table db-dbname table-tablename"">
The row template ( /dbname/tablename/rowid ) gets:
<body class=""row db-dbname table-tablename"">
The db-x and table-x classes use the database or table names themselves if
they are valid CSS identifiers. If they aren't, we strip any invalid
characters out and append a 6 character md5 digest of the original name, in
order to ensure that multiple tables which resolve to the same stripped
character version still have different CSS classes.
Some examples:
""simple"" => ""simple""
""MixedCase"" => ""MixedCase""
""-no-leading-hyphens"" => ""no-leading-hyphens-65bea6""
""_no-leading-underscores"" => ""no-leading-underscores-b921bc""
""no spaces"" => ""no-spaces-7088d7""
""-"" => ""336d5e""
""no $ characters"" => ""no--characters-59e024""
<td> and <th> elements also get custom CSS classes reflecting the
database column they are representing, for example:
<th class=""col-name"" scope=""col"">name</th>
<td class=""col-name"">...</td>
","[""Custom pages and templates"", ""Custom CSS and JavaScript""]",[]
changelog:csrf-protection,changelog,csrf-protection,CSRF protection,"Since writable canned queries are built using POST forms, Datasette now ships with CSRF protection ( #798 ). This applies automatically to any POST request, which means plugins need to include a csrftoken in any POST forms that they render. They can do that like so:
","[""Changelog"", ""0.44 (2020-06-11)""]","[{""href"": ""https://github.com/simonw/datasette/issues/798"", ""label"": ""#798""}]"
changelog:cookie-methods,changelog,cookie-methods,Cookie methods,"Plugins can now use the new response.set_cookie() method to set cookies.
A new request.cookies property on the Request object can be used to read incoming cookies.
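A minimal sketch of both methods together - the cookie name and values here are illustrative:
from datasette.utils.asgi import Response

def my_view(request):
    # Read an incoming cookie from the request
    previous = request.cookies.get(""last_seen"")
    response = Response.html(""<p>Last seen: {}</p>"".format(previous))
    # Set a cookie on the outgoing response
    response.set_cookie(""last_seen"", ""2020-06-11"")
    return response","[""Changelog"", ""0.44 (2020-06-11)""]",[]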
changelog:control-http-caching-with-ttl,changelog,control-http-caching-with-ttl,Control HTTP caching with ?_ttl=,"You can now customize the HTTP max-age header that is sent on a per-URL basis, using the new ?_ttl= query string parameter.
You can set this to any value in seconds, or you can set it to 0 to disable HTTP caching entirely.
Consider for example this query which returns a randomly selected member of the Avengers:
select * from [avengers/avengers] order by random() limit 1
If you hit the following page repeatedly you will get the same result, due to HTTP caching:
/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1
By adding ?_ttl=0 to the URL you can ensure the page will not be cached and get back a different super hero every time:
/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1"", ""label"": ""/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1""}, {""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0"", ""label"": ""/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0""}]"
contributing:contributing-using-fixtures,contributing,contributing-using-fixtures,Using fixtures,"To run Datasette itself, type datasette .
You're going to need at least one SQLite database. A quick way to get started is to use the fixtures database that Datasette uses for its own tests.
You can create a copy of that database by running this command:
python tests/fixtures.py fixtures.db
Now you can run Datasette against the new fixtures database like so:
datasette fixtures.db
This will start a server at http://127.0.0.1:8001/ .
Any changes you make in the datasette/templates or datasette/static folder will be picked up immediately (though you may need to do a force-refresh in your browser to see changes to CSS or JavaScript).
If you want to change Datasette's Python code you can use the --reload option to cause Datasette to automatically reload any time the underlying code changes:
datasette --reload fixtures.db
You can also use the fixtures.py script to recreate the testing version of metadata.json used by the unit tests. To do that:
python tests/fixtures.py fixtures.db fixtures-metadata.json
Or to output the plugins used by the tests, run this:
python tests/fixtures.py fixtures.db fixtures-metadata.json fixtures-plugins
Test tables written to fixtures.db
- metadata written to fixtures-metadata.json
Wrote plugin: fixtures-plugins/register_output_renderer.py
Wrote plugin: fixtures-plugins/view_name.py
Wrote plugin: fixtures-plugins/my_plugin.py
Wrote plugin: fixtures-plugins/messages_output_renderer.py
Wrote plugin: fixtures-plugins/my_plugin_2.py
Then run Datasette like this:
datasette fixtures.db -m fixtures-metadata.json --plugins-dir=fixtures-plugins/","[""Contributing""]",[]
contributing:contributing-upgrading-codemirror,contributing,contributing-upgrading-codemirror,Upgrading CodeMirror,"Datasette bundles CodeMirror for the SQL editing interface, e.g. on this page . Here are the steps for upgrading to a new version of CodeMirror:
Download and extract latest CodeMirror zip file from https://codemirror.net/codemirror.zip
Rename lib/codemirror.js to codemirror-5.57.0.js (using the latest version number)
Rename lib/codemirror.css to codemirror-5.57.0.css
Rename mode/sql/sql.js to codemirror-5.57.0-sql.js
Edit both JavaScript files to make the top license comment a /* */ block instead of multiple // lines
Minify the JavaScript files like this:
npx uglify-js codemirror-5.57.0.js -o codemirror-5.57.0.min.js --comments '/LICENSE/'
npx uglify-js codemirror-5.57.0-sql.js -o codemirror-5.57.0-sql.min.js --comments '/LICENSE/'
Check that the LICENSE comment did indeed survive minification
Minify the CSS file like this:
npx clean-css-cli codemirror-5.57.0.css -o codemirror-5.57.0.min.css
Edit the _codemirror.html template to reference the new files
git rm the old files, git add the new files","[""Contributing""]","[{""href"": ""https://codemirror.net/"", ""label"": ""CodeMirror""}, {""href"": ""https://latest.datasette.io/fixtures"", ""label"": ""this page""}, {""href"": ""https://codemirror.net/codemirror.zip"", ""label"": ""https://codemirror.net/codemirror.zip""}]"
contributing:contributing-running-tests,contributing,contributing-running-tests,Running the tests,"Once you have done this, you can run the Datasette unit tests from inside your datasette/ directory using pytest like so:
pytest
You can run the tests faster using multiple CPU cores with pytest-xdist like this:
pytest -n auto -m ""not serial""
-n auto detects the number of available cores automatically. The -m ""not serial"" skips tests that don't work well in a parallel test environment. You can run those tests separately like so:
pytest -m ""serial""","[""Contributing""]","[{""href"": ""https://docs.pytest.org/"", ""label"": ""pytest""}, {""href"": ""https://pypi.org/project/pytest-xdist/"", ""label"": ""pytest-xdist""}]"
contributing:contributing-release,contributing,contributing-release,Release process,"Datasette releases are performed using tags. When a new release is published on GitHub, a GitHub Action workflow will perform the following:
Run the unit tests against all supported Python versions. If the tests pass...
Build a Docker image of the release and push a tag to https://hub.docker.com/r/datasetteproject/datasette
Re-point the ""latest"" tag on Docker Hub to the new image
Build a wheel bundle of the underlying Python source code
Push that new wheel up to PyPI: https://pypi.org/project/datasette/
To deploy new releases you will need to have push access to the main Datasette GitHub repository.
Datasette follows Semantic Versioning :
major.minor.patch
We increment major for backwards-incompatible releases. Datasette is currently pre-1.0 so the major version is always 0 .
We increment minor for new features.
We increment patch for bugfix releases.
Alpha and beta releases may have an additional a0 or b0 suffix - the integer component will be incremented with each subsequent alpha or beta.
To release a new version, first create a commit that updates the version number in datasette/version.py and the changelog with highlights of the new version. An example commit can be seen here :
# Update changelog
git commit -m "" Release 0.51a1
Refs #1056, #1039, #998, #1045, #1033, #1036, #1034, #976, #1057, #1058, #1053, #1064, #1066"" -a
git push
Referencing the issues that are part of the release in the commit message ensures the name of the release shows up on those issue pages, e.g. here .
You can generate the list of issue references for a specific release by copying and pasting text from the release notes or GitHub changes-since-last-release view into this Extract issue numbers from pasted text tool.
To create the tag for the release, create a new release on GitHub matching the new version number. You can convert the release notes to Markdown by copying and pasting the rendered HTML into this Paste to Markdown tool .
Finally, post a news item about the release on datasette.io by editing the news.yaml file in that site's repository.","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/blob/main/.github/workflows/deploy-latest.yml"", ""label"": ""GitHub Action workflow""}, {""href"": ""https://hub.docker.com/r/datasetteproject/datasette"", ""label"": ""https://hub.docker.com/r/datasetteproject/datasette""}, {""href"": ""https://pypi.org/project/datasette/"", ""label"": ""https://pypi.org/project/datasette/""}, {""href"": ""https://semver.org/"", ""label"": ""Semantic Versioning""}, {""href"": ""https://github.com/simonw/datasette/commit/0e1e89c6ba3d0fbdb0823272952cf356f3016def"", ""label"": ""commit can be seen here""}, {""href"": ""https://github.com/simonw/datasette/issues/581#ref-commit-d56f402"", ""label"": ""here""}, {""href"": ""https://observablehq.com/@simonw/extract-issue-numbers-from-pasted-text"", ""label"": ""Extract issue numbers from pasted text""}, {""href"": ""https://github.com/simonw/datasette/releases/new"", ""label"": ""a new release""}, {""href"": ""https://euangoddard.github.io/clipboard2markdown/"", ""label"": ""Paste to Markdown tool""}, {""href"": ""https://datasette.io/"", ""label"": ""datasette.io""}, {""href"": ""https://github.com/simonw/datasette.io/blob/main/news.yaml"", ""label"": ""news.yaml""}]"
contributing:contributing-formatting-prettier,contributing,contributing-formatting-prettier,Prettier,"To install Prettier, install Node.js and then run the following in the root of your datasette repository checkout:
$ npm install
This will install Prettier in a node_modules directory. You can then check that your code matches the coding style like so:
$ npm run prettier -- --check
> prettier
> prettier 'datasette/static/*[!.min].js' ""--check""
Checking formatting...
[warn] datasette/static/plugins.js
[warn] Code style issues found in the above file(s). Forgot to run Prettier?
You can fix any problems by running:
$ npm run fix","[""Contributing"", ""Code formatting""]","[{""href"": ""https://nodejs.org/en/download/package-manager/"", ""label"": ""install Node.js""}]"
contributing:contributing-formatting-blacken-docs,contributing,contributing-formatting-blacken-docs,blacken-docs,"The blacken-docs command applies Black formatting rules to code examples in the documentation. Run it like this:
blacken-docs -l 60 docs/*.rst","[""Contributing"", ""Code formatting""]","[{""href"": ""https://pypi.org/project/blacken-docs/"", ""label"": ""blacken-docs""}]"
contributing:contributing-formatting-black,contributing,contributing-formatting-black,Running Black,"Black will be installed when you run pip install -e '.[test]' . To test that your code complies with Black, run the following in your root datasette repository checkout:
$ black . --check
All done! ✨ 🍰 ✨
95 files would be left unchanged.
If any of your code does not conform to Black you can run this to automatically fix those problems:
$ black .
reformatted ../datasette/setup.py
All done! ✨ 🍰 ✨
1 file reformatted, 94 files left unchanged.","[""Contributing"", ""Code formatting""]",[]
contributing:contributing-formatting,contributing,contributing-formatting,Code formatting,"Datasette uses opinionated code formatters: Black for Python and Prettier for JavaScript.
These formatters are enforced by Datasette's continuous integration: if a commit includes Python or JavaScript code that does not match the style enforced by those tools, the tests will fail.
When developing locally, you can verify and correct the formatting of your code using these tools.","[""Contributing""]","[{""href"": ""https://github.com/psf/black"", ""label"": ""Black""}, {""href"": ""https://prettier.io/"", ""label"": ""Prettier""}]"
contributing:contributing-documentation-cog,contributing,contributing-documentation-cog,Running Cog,"Some pages of documentation (in particular the CLI reference ) are automatically updated using Cog .
To update these pages, run the following command:
cog -r docs/*.rst","[""Contributing"", ""Editing and building the documentation""]","[{""href"": ""https://github.com/nedbat/cog"", ""label"": ""Cog""}]"
contributing:contributing-documentation,contributing,contributing-documentation,Editing and building the documentation,"Datasette's documentation lives in the docs/ directory and is deployed automatically using Read The Docs .
The documentation is written using reStructuredText. You may find this article on The subset of reStructuredText worth committing to memory useful.
You can build it locally by installing sphinx and sphinx_rtd_theme in your Datasette development environment and then running make html directly in the docs/ directory:
# You may first need to activate your virtual environment:
source venv/bin/activate
# Install the dependencies needed to build the docs
pip install -e .[docs]
# Now build the docs
cd docs/
make html
This will create the HTML version of the documentation in docs/_build/html . You can open it in your browser like so:
open _build/html/index.html
Any time you make changes to a .rst file you can re-run make html to update the built documents, then refresh them in your browser.
For added productivity, you can use sphinx-autobuild to run Sphinx in auto-build mode. This will run a local webserver serving the docs that automatically rebuilds them and refreshes the page any time you hit save in your editor.
sphinx-autobuild will have been installed when you ran pip install -e .[docs] . In your docs/ directory you can start the server by running the following:
make livehtml
Now browse to http://localhost:8000/ to view the documentation. Any edits you make should be instantly reflected in your browser.","[""Contributing""]","[{""href"": ""https://readthedocs.org/"", ""label"": ""Read The Docs""}, {""href"": ""https://simonwillison.net/2018/Aug/25/restructuredtext/"", ""label"": ""The subset of reStructuredText worth committing to memory""}, {""href"": ""https://pypi.org/project/sphinx-autobuild/"", ""label"": ""sphinx-autobuild""}]"
contributing:contributing-debugging,contributing,contributing-debugging,Debugging,"Any errors that occur while Datasette is running will display a stack trace on the console.
You can tell Datasette to open an interactive pdb debugger session if an error occurs using the --pdb option:
datasette --pdb fixtures.db","[""Contributing""]",[]
contributing:contributing-continuous-deployment,contributing,contributing-continuous-deployment,Continuously deployed demo instances,"The demo instance at latest.datasette.io is re-deployed automatically to Google Cloud Run for every push to main that passes the test suite. This is implemented by the GitHub Actions workflow at .github/workflows/deploy-latest.yml .
Specific branches can also be set to automatically deploy by adding them to the on: push: branches block at the top of the workflow YAML file. Branches configured in this way will be deployed to a new Cloud Run service whether or not their tests pass.
The Cloud Run URL for a branch demo can be found in the GitHub Actions logs.","[""Contributing""]","[{""href"": ""https://latest.datasette.io/"", ""label"": ""latest.datasette.io""}, {""href"": ""https://github.com/simonw/datasette/blob/main/.github/workflows/deploy-latest.yml"", ""label"": "".github/workflows/deploy-latest.yml""}]"
contributing:contributing-bug-fix-branch,contributing,contributing-bug-fix-branch,Releasing bug fixes from a branch,"If it's necessary to publish a bug fix release without shipping new features that have landed on main , a release branch can be used.
Create it from the relevant last tagged release like so:
git branch 0.52.x 0.52.4
git checkout 0.52.x
Next cherry-pick the commits containing the bug fixes:
git cherry-pick COMMIT
Write the release notes in the branch, and update the version number in version.py . Then push the branch:
git push -u origin 0.52.x
Once the tests have completed, publish the release from that branch target using the GitHub Draft a new release form.
Finally, cherry-pick the commit with the release notes and version number bump across to main :
git checkout main
git cherry-pick COMMIT
git push","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/releases/new"", ""label"": ""Draft a new release""}]"
contributing:contributing-alpha-beta,contributing,contributing-alpha-beta,Alpha and beta releases,"Alpha and beta releases are published to preview upcoming features that may not yet be stable - in particular to preview new plugin hooks.
You are welcome to try these out, but please be aware that details may change before the final release.
Please join discussions on the issue tracker to share your thoughts and experiences with alpha and beta features that you try out.","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/issues"", ""label"": ""discussions on the issue tracker""}]"
index:contents,index,contents,Contents,"Getting started Play with a live demo Follow a tutorial Datasette in your browser with Datasette Lite Try Datasette without installing anything using Glitch Using Datasette on your own computer Installation Basic installation Datasette Desktop for Mac Using Homebrew Using pip Advanced installation options Using pipx Using Docker A note about extensions The Datasette Ecosystem sqlite-utils Dogsheep CLI reference datasette --help datasette serve datasette --get datasette serve --help-settings datasette plugins datasette install datasette uninstall datasette publish datasette publish cloudrun datasette publish heroku datasette package datasette inspect Pages and API endpoints Top-level index Database Table Row Publishing data datasette publish Publishing to Google Cloud Run Publishing to Heroku Publishing to Vercel Publishing to Fly Custom metadata and plugins datasette package Deploying Datasette Deployment fundamentals Running Datasette using systemd Running Datasette using OpenRC Deploying using buildpacks Running Datasette behind a proxy Nginx proxy configuration Apache proxy configuration JSON API Different shapes Pagination Special JSON arguments Table arguments Column filter arguments Special table arguments Expanding foreign key references Discovering the JSON for a page Running SQL queries Named parameters Views Canned queries Canned query parameters Additional canned query options Writable canned queries Magic parameters JSON API for writable canned queries Pagination Cross-database queries Authentication and permissions Actors Using the ""root"" actor Permissions Defining permissions with ""allow"" blocks The /-/allow-debug tool Configuring permissions in metadata.json Controlling access to an instance Controlling access to specific databases Controlling access to specific tables and views Controlling access to specific canned queries Controlling the ability to execute arbitrary SQL Checking permissions in plugins actor_matches_allow() The permissions debug tool The ds_actor cookie Including an expiry time The /-/logout page Built-in permissions view-instance view-database view-database-download view-table view-query execute-sql permissions-debug debug-menu Performance and caching Immutable mode Using ""datasette inspect"" HTTP caching datasette-hashed-urls CSV export URL parameters Streaming all records Binary data Linking to binary downloads Binary plugins Facets Facets in query strings Facets in metadata.json Suggested facets Speeding up facets with indexes Facet by JSON array Facet by date Full-text search The table page and table view API Advanced SQLite search queries Configuring full-text search for a table or view Searches using custom SQL Enabling full-text search for a SQLite table Configuring FTS using sqlite-utils Configuring FTS using csvs-to-sqlite Configuring FTS by hand FTS versions SpatiaLite Warning Installation Installing SpatiaLite on OS X Installing SpatiaLite on Linux Spatial indexing latitude/longitude columns Making use of a spatial index Importing shapefiles into SpatiaLite Importing GeoJSON polygons using Shapely Querying polygons using within() Metadata Per-database and per-table metadata Source, license and about Column descriptions Specifying units for a column Setting a default sort order Setting a custom page size Setting which columns can be used for sorting Specifying the label column for a table Hiding tables Using YAML for metadata Settings Using --setting Configuration directory mode Settings 
default_allow_sql default_page_size sql_time_limit_ms max_returned_rows num_sql_threads allow_facet default_facet_size facet_time_limit_ms facet_suggest_time_limit_ms suggest_facets allow_download default_cache_ttl cache_size_kb allow_csv_stream max_csv_mb truncate_cells_html force_https_urls template_debug trace_debug base_url Configuring the secret Using secrets with datasette publish Introspection /-/metadata /-/versions /-/plugins /-/settings /-/databases /-/threads /-/actor /-/messages Custom pages and templates Custom CSS and JavaScript CSS classes on the Serving static files Publishing static assets Custom templates Custom pages Path parameters for pages Custom headers and status codes Returning 404s Custom redirects Custom error pages Plugins Installing plugins One-off plugins using --plugins-dir Deploying plugins using datasette publish Seeing what plugins are installed Plugin configuration Secret configuration values Writing plugins Writing one-off plugins Starting an installable plugin using cookiecutter Packaging a plugin Static assets Custom templates Writing plugins that accept configuration Designing URLs for your plugin Building URLs within plugins Plugin hooks prepare_connection(conn, database, datasette) prepare_jinja2_environment(env, datasette) extra_template_vars(template, database, table, columns, view_name, request, datasette) extra_css_urls(template, database, table, columns, view_name, request, datasette) extra_js_urls(template, database, table, columns, view_name, request, datasette) extra_body_script(template, database, table, columns, view_name, request, datasette) publish_subcommand(publish) render_cell(row, value, column, table, database, datasette) register_output_renderer(datasette) register_routes(datasette) register_commands(cli) register_facet_classes() asgi_wrapper(datasette) startup(datasette) canned_queries(datasette, database, actor) actor_from_request(datasette, request) filters_from_request(request, database, table, datasette) permission_allowed(datasette, actor, action, resource) register_magic_parameters(datasette) forbidden(datasette, request, message) handle_exception(datasette, request, exception) menu_links(datasette, actor, request) table_actions(datasette, actor, database, table, request) database_actions(datasette, actor, database, request) skip_csrf(datasette, scope) get_metadata(datasette, key, database, table) Testing plugins Setting up a Datasette test instance Using pdb for errors thrown inside Datasette Using pytest fixtures Testing outbound HTTP calls with pytest-httpx Registering a plugin for the duration of a test Internals for plugins Request object The MultiParams class Response class Returning a response with .asgi_send(send) Setting cookies with response.set_cookie() Datasette class .databases .plugin_config(plugin_name, database=None, table=None) await .render_template(template, context=None, request=None) await .permission_allowed(actor, action, resource=None, default=False) await .ensure_permissions(actor, permissions) await .check_visibility(actor, action=None, resource=None, permissions=None) .get_database(name) .add_database(db, name=None, route=None) .add_memory_database(name) .remove_database(name) .sign(value, namespace=""default"") .unsign(value, namespace=""default"") .add_message(request, message, type=datasette.INFO) .absolute_url(request, path) .setting(key) datasette.client datasette.urls Database class Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None) db.hash await db.execute(sql, ...) 
Results await db.execute_fn(fn) await db.execute_write(sql, params=None, block=True) await db.execute_write_script(sql, block=True) await db.execute_write_many(sql, params_seq, block=True) await db.execute_write_fn(fn, block=True) db.close() Database introspection CSRF protection The _internal database The datasette.utils module parse_metadata(content) await_me_maybe(value) Tilde encoding datasette.tracer Tracing child tasks Import shortcuts Contributing General guidelines Setting up a development environment Running the tests Using fixtures Debugging Code formatting Running Black blacken-docs Prettier Editing and building the documentation Running Cog Continuously deployed demo instances Release process Alpha and beta releases Releasing bug fixes from a branch Upgrading CodeMirror Changelog 0.64.6 (2023-12-22) 0.64.5 (2023-10-08) 0.64.4 (2023-09-21) 0.64.3 (2023-04-27) 0.64.2 (2023-03-08) 0.64.1 (2023-01-11) 0.64 (2023-01-09) 0.63.3 (2022-12-17) 0.63.2 (2022-11-18) 0.63.1 (2022-11-10) 0.63 (2022-10-27) Features Plugin hooks and internals Documentation 0.62 (2022-08-14) Features Plugin hooks Bug fixes Documentation 0.61.1 (2022-03-23) 0.61 (2022-03-23) 0.60.2 (2022-02-07) 0.60.1 (2022-01-20) 0.60 (2022-01-13) Plugins and internals Faceting Other small fixes 0.59.4 (2021-11-29) 0.59.3 (2021-11-20) 0.59.2 (2021-11-13) 0.59.1 (2021-10-24) 0.59 (2021-10-14) 0.58.1 (2021-07-16) 0.58 (2021-07-14) 0.57.1 (2021-06-08) 0.57 (2021-06-05) New features Bug fixes and other improvements 0.56.1 (2021-06-05) 0.56 (2021-03-28) 0.55 (2021-02-18) 0.54.1 (2021-02-02) 0.54 (2021-01-25) The _internal database Named in-memory database support JavaScript modules Code formatting with Black and Prettier Other changes 0.53 (2020-12-10) 0.52.5 (2020-12-09) 0.52.4 (2020-12-05) 0.52.3 (2020-12-03) 0.52.2 (2020-12-02) 0.52.1 (2020-11-29) 0.52 (2020-11-28) 0.51.1 (2020-10-31) 0.51 (2020-10-31) New visual design Plugins can now add links within Datasette Binary data URL building Running Datasette behind a proxy Smaller changes 0.50.2 (2020-10-09) 0.50.1 (2020-10-09) 0.50 (2020-10-09) 0.49.1 (2020-09-15) 0.49 (2020-09-14) 0.48 (2020-08-16) 0.47.3 (2020-08-15) 0.47.2 (2020-08-12) 0.47.1 (2020-08-11) 0.47 (2020-08-11) 0.46 (2020-08-09) 0.45 (2020-07-01) Magic parameters for canned queries Log out Better plugin documentation New plugin hooks Smaller changes 0.44 (2020-06-11) Authentication Permissions Writable canned queries Flash messages Signed values and secrets CSRF protection Cookie methods register_routes() plugin hooks Smaller changes The road to Datasette 1.0 0.43 (2020-05-28) 0.42 (2020-05-08) 0.41 (2020-05-06) 0.40 (2020-04-21) 0.39 (2020-03-24) 0.38 (2020-03-08) 0.37.1 (2020-03-02) 0.37 (2020-02-25) 0.36 (2020-02-21) 0.35 (2020-02-04) 0.34 (2020-01-29) 0.33 (2019-12-22) 0.32 (2019-11-14) 0.31.2 (2019-11-13) 0.31.1 (2019-11-12) 0.31 (2019-11-11) 0.30.2 (2019-11-02) 0.30.1 (2019-10-30) 0.30 (2019-10-18) 0.29.3 (2019-09-02) 0.29.2 (2019-07-13) 0.29.1 (2019-07-11) 0.29 (2019-07-07) ASGI New plugin hook: asgi_wrapper New plugin hook: extra_template_vars Secret plugin configuration options Facet by date Easier custom templates for table rows ?_through= for joins through many-to-many tables Small changes 0.28 (2019-05-19) Supporting databases that change Faceting improvements, and faceting plugins datasette publish cloudrun register_output_renderer plugins Medium changes Small changes 0.27.1 (2019-05-09) 0.27 (2019-01-31) 0.26.1 (2019-01-10) 0.26 (2019-01-02) 0.25.2 (2018-12-16) 0.25.1 (2018-11-04) 0.25 (2018-09-19) 0.24 
(2018-07-23) 0.23.2 (2018-07-07) 0.23.1 (2018-06-21) 0.23 (2018-06-18) CSV export Foreign key expansions New configuration settings Control HTTP caching with ?_ttl= Improved support for SpatiaLite latest.datasette.io Miscellaneous 0.22.1 (2018-05-23) 0.22 (2018-05-20) 0.21 (2018-05-05) 0.20 (2018-04-20) 0.19 (2018-04-16) 0.18 (2018-04-14) 0.17 (2018-04-13) 0.16 (2018-04-13) 0.15 (2018-04-09) 0.14 (2017-12-09) 0.13 (2017-11-24) 0.12 (2017-11-16) 0.11 (2017-11-14) 0.10 (2017-11-14) 0.9 (2017-11-13) 0.8 (2017-11-13)","[""Datasette""]",[]
full_text_search:configuring-fts-using-sqlite-utils,full_text_search,configuring-fts-using-sqlite-utils,Configuring FTS using sqlite-utils,"sqlite-utils is a CLI utility and Python library for manipulating SQLite databases. You can use it from Python code to configure FTS search, or you can achieve the same goal using the accompanying command-line tool .
Here's how to use sqlite-utils to enable full-text search for an items table across the name and description columns:
$ sqlite-utils enable-fts mydatabase.db items name description
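The equivalent using the sqlite-utils Python library - a minimal sketch, assuming the same database file and table - looks like this:
import sqlite_utils

db = sqlite_utils.Database(""mydatabase.db"")
# Create an FTS index covering the name and description columns
db[""items""].enable_fts([""name"", ""description""])","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://sqlite-utils.datasette.io/"", ""label"": ""sqlite-utils""}, {""href"": ""https://sqlite-utils.datasette.io/en/latest/python-api.html#enabling-full-text-search"", ""label"": ""it from Python code""}, {""href"": ""https://sqlite-utils.datasette.io/en/latest/cli.html#configuring-full-text-search"", ""label"": ""using the accompanying command-line tool""}]"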
full_text_search:configuring-fts-using-csvs-to-sqlite,full_text_search,configuring-fts-using-csvs-to-sqlite,Configuring FTS using csvs-to-sqlite,"If your data starts out in CSV files, you can use Datasette's companion tool csvs-to-sqlite to convert that file into a SQLite database and enable full-text search on specific columns. For a file called items.csv where you want full-text search to operate against the name and description columns you would run the following:
$ csvs-to-sqlite items.csv items.db -f name -f description","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://github.com/simonw/csvs-to-sqlite"", ""label"": ""csvs-to-sqlite""}]"
full_text_search:configuring-fts-by-hand,full_text_search,configuring-fts-by-hand,Configuring FTS by hand,"We recommend using sqlite-utils , but if you want to hand-roll a SQLite full-text search table you can do so using the following SQL.
To enable full-text search for a table called items that works against the name and description columns, you would run this SQL to create a new items_fts FTS virtual table:
CREATE VIRTUAL TABLE ""items_fts"" USING FTS4 (
name,
description,
content=""items""
);
This creates a set of tables to power full-text search against items . The new items_fts table will be detected by Datasette as the fts_table for the items table.
Creating the table is not enough: you also need to populate it with a copy of the data that you wish to make searchable. You can do that using the following SQL:
INSERT INTO ""items_fts"" (rowid, name, description)
SELECT rowid, name, description FROM items;
If your table has columns that are foreign key references to other tables you can include that data in your full-text search index using a join. Imagine the items table has a foreign key column called category_id which refers to a categories table - you could create a full-text search table like this:
CREATE VIRTUAL TABLE ""items_fts"" USING FTS4 (
name,
description,
category_name,
content=""items""
);
And then populate it like this:
INSERT INTO ""items_fts"" (rowid, name, description, category_name)
SELECT items.rowid,
items.name,
items.description,
categories.name
FROM items JOIN categories ON items.category_id=categories.id;
You can use this technique to populate the full-text search index from any combination of tables and joins that makes sense for your project.
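If you would rather run that SQL from Python, here is a minimal sketch using the standard library sqlite3 module, assuming a database file called items.db :
import sqlite3

conn = sqlite3.connect(""items.db"")
conn.executescript(""""""
CREATE VIRTUAL TABLE ""items_fts"" USING FTS4 (
    name,
    description,
    content=""items""
);
INSERT INTO ""items_fts"" (rowid, name, description)
    SELECT rowid, name, description FROM items;
"""""")
conn.commit()
conn.close()","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://sqlite-utils.datasette.io/"", ""label"": ""sqlite-utils""}]"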
settings:config-dir,settings,config-dir,Configuration directory mode,"Normally you configure Datasette using command-line options. For a Datasette instance with custom templates, custom plugins, a static directory and several databases this can get quite verbose:
$ datasette one.db two.db \
--metadata=metadata.json \
--template-dir=templates/ \
--plugins-dir=plugins \
--static css:css
As an alternative to this, you can run Datasette in configuration directory mode. Create a directory with the following structure:
# In a directory called my-app:
my-app/one.db
my-app/two.db
my-app/metadata.json
my-app/templates/index.html
my-app/plugins/my_plugin.py
my-app/static/my.css
Now start Datasette by providing the path to that directory:
$ datasette my-app/
Datasette will detect the files in that directory and automatically configure itself using them. It will serve all *.db files that it finds, will load metadata.json if it exists, and will load the templates , plugins and static folders if they are present.
The files that can be included in this directory are as follows. All are optional.
*.db (or *.sqlite3 or *.sqlite ) - SQLite database files that will be served by Datasette
metadata.json - Metadata for those databases - metadata.yaml or metadata.yml can be used as well
inspect-data.json - the result of running datasette inspect *.db --inspect-file=inspect-data.json from the configuration directory - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running
settings.json - settings that would normally be passed using --setting - here they should be stored as a JSON object of key/value pairs
templates/ - a directory containing Custom templates
plugins/ - a directory containing plugins, see Writing one-off plugins
static/ - a directory containing static files - these will be served from /static/filename.txt , see Serving static files","[""Settings""]",[]
json_api:column-filter-arguments,json_api,column-filter-arguments,Column filter arguments,"You can filter the data returned by the table based on column values using a query string argument.
?column__exact=value or ?_column=value
Returns rows where the specified column exactly matches the value.
?column__not=value
Returns rows where the column does not match the value.
?column__contains=value
Rows where the string column contains the specified value ( column like ""%value%"" in SQL).
?column__endswith=value
Rows where the string column ends with the specified value ( column like ""%value"" in SQL).
?column__startswith=value
Rows where the string column starts with the specified value ( column like ""value%"" in SQL).
?column__gt=value
Rows which are greater than the specified value.
?column__gte=value
Rows which are greater than or equal to the specified value.
?column__lt=value
Rows which are less than the specified value.
?column__lte=value
Rows which are less than or equal to the specified value.
?column__like=value
Match rows with a LIKE clause, case insensitive and with % as the wildcard character.
?column__notlike=value
Match rows that do not match the provided LIKE clause.
?column__glob=value
Similar to LIKE but uses Unix wildcard syntax and is case sensitive.
?column__in=value1,value2,value3
Rows where column matches any of the provided values.
You can use a comma separated string, or you can use a JSON array.
The JSON array option is useful if one of your matching values itself contains a comma:
?column__in=[""value"",""value,with,commas""]
?column__notin=value1,value2,value3
Rows where column does not match any of the provided values. The inverse of __in= . Also supports JSON arrays.
?column__arraycontains=value
Works against columns that contain JSON arrays - matches if any of the values in that array match the provided value.
This is only available if the json1 SQLite extension is enabled.
?column__arraynotcontains=value
Works against columns that contain JSON arrays - matches if none of the values in that array match the provided value.
This is only available if the json1 SQLite extension is enabled.
?column__date=value
Column is a datestamp occurring on the specified YYYY-MM-DD date, e.g. 2018-01-02 .
?column__isnull=1
Matches rows where the column is null.
?column__notnull=1
Matches rows where the column is not null.
?column__isblank=1
Matches rows where the column is blank, meaning null or the empty string.
?column__notblank=1
Matches rows where the column is not blank.
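These arguments combine naturally with an HTTP client library. A minimal sketch using requests against the fixtures demo - the column names here are illustrative:
import requests

# Rows where text1 contains ""park"" and pk is greater than or equal to 5
response = requests.get(
    ""https://latest.datasette.io/fixtures/searchable.json"",
    params={
        ""text1__contains"": ""park"",
        ""pk__gte"": 5,
        ""_shape"": ""array"",
    },
)
print(response.json())","[""JSON API"", ""Table arguments""]",[]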
changelog:code-formatting-with-black-and-prettier,changelog,code-formatting-with-black-and-prettier,Code formatting with Black and Prettier,"Datasette adopted Black for opinionated Python code formatting in June 2019. Datasette now also embraces Prettier for JavaScript formatting, which like Black is enforced by tests in continuous integration. Instructions for using these two tools can be found in the new section on Code formatting in the contributors documentation. ( #1167 )","[""Changelog"", ""0.54 (2021-01-25)""]","[{""href"": ""https://github.com/psf/black"", ""label"": ""Black""}, {""href"": ""https://prettier.io/"", ""label"": ""Prettier""}, {""href"": ""https://github.com/simonw/datasette/issues/1167"", ""label"": ""#1167""}]"
publish:cli-publish,publish,cli-publish,datasette publish,"Once you have created a SQLite database (e.g. using csvs-to-sqlite ) you can deploy it to a hosting account using a single command.
You will need a hosting account with Heroku or Google Cloud . Once you have created your account you will need to install and configure the heroku or gcloud command-line tools.","[""Publishing data""]","[{""href"": ""https://github.com/simonw/csvs-to-sqlite/"", ""label"": ""csvs-to-sqlite""}, {""href"": ""https://www.heroku.com/"", ""label"": ""Heroku""}, {""href"": ""https://cloud.google.com/"", ""label"": ""Google Cloud""}]"
publish:cli-package,publish,cli-package,datasette package,"If you have docker installed (e.g. using Docker for Mac ) you can use the datasette package command to create a new Docker image in your local repository containing the datasette app bundled together with one or more SQLite databases:
datasette package mydatabase.db
Here's example output for the package command:
$ datasette package parlgov.db --extra-options=""--setting sql_time_limit_ms 2500""
Sending build context to Docker daemon 4.459MB
Step 1/7 : FROM python:3.11.0-slim-bullseye
---> 79e1dc9af1c1
Step 2/7 : COPY . /app
---> Using cache
---> cd4ec67de656
Step 3/7 : WORKDIR /app
---> Using cache
---> 139699e91621
Step 4/7 : RUN pip install datasette
---> Using cache
---> 340efa82bfd7
Step 5/7 : RUN datasette inspect parlgov.db --inspect-file inspect-data.json
---> Using cache
---> 5fddbe990314
Step 6/7 : EXPOSE 8001
---> Using cache
---> 8e83844b0fed
Step 7/7 : CMD datasette serve parlgov.db --port 8001 --inspect-file inspect-data.json --setting sql_time_limit_ms 2500
---> Using cache
---> 1bd380ea8af3
Successfully built 1bd380ea8af3
You can now run the resulting container like so:
docker run -p 8081:8001 1bd380ea8af3
This exposes port 8001 inside the container as port 8081 on your host machine, so you can access the application at http://localhost:8081/
You can customize the port that is exposed by the container using the --port option:
datasette package mydatabase.db --port 8080
A full list of options can be seen by running datasette package --help . See datasette package in the CLI reference for details of each option.","[""Publishing data""]","[{""href"": ""https://www.docker.com/docker-mac"", ""label"": ""Docker for Mac""}]"
cli-reference:cli-help-uninstall-help,cli-reference,cli-help-uninstall-help,datasette uninstall,"Uninstall one or more plugins.
[[[cog
help([""uninstall"", ""--help""])
]]]
Usage: datasette uninstall [OPTIONS] PACKAGES...
Uninstall plugins and Python packages from the Datasette environment
Options:
-y, --yes Don't ask for confirmation
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-serve-help-settings,cli-reference,cli-help-serve-help-settings,datasette serve --help-settings,"This command outputs all of the available Datasette settings .
These can be passed to datasette serve using datasette serve --setting name value .
[[[cog
help([""--help-settings""])
]]]
Settings:
default_page_size Default page size for the table view
(default=100)
max_returned_rows Maximum rows that can be returned from a table or
custom query (default=1000)
num_sql_threads Number of threads in the thread pool for
executing SQLite queries (default=3)
sql_time_limit_ms Time limit for a SQL query in milliseconds
(default=1000)
default_facet_size Number of values to return for requested facets
(default=30)
facet_time_limit_ms Time limit for calculating a requested facet
(default=200)
facet_suggest_time_limit_ms Time limit for calculating a suggested facet
(default=50)
allow_facet Allow users to specify columns to facet using
?_facet= parameter (default=True)
default_allow_sql Allow anyone to run arbitrary SQL queries
(default=True)
allow_download Allow users to download the original SQLite
database files (default=True)
suggest_facets Calculate and display suggested facets
(default=True)
default_cache_ttl Default HTTP cache TTL (used in Cache-Control:
max-age= header) (default=5)
cache_size_kb SQLite cache size in KB (0 == use SQLite default)
(default=0)
allow_csv_stream Allow .csv?_stream=1 to download all rows
(ignoring max_returned_rows) (default=True)
max_csv_mb Maximum size allowed for CSV export in MB - set 0
to disable this limit (default=100)
truncate_cells_html Truncate cells longer than this in HTML table
view - set 0 to disable (default=2048)
force_https_urls Force URLs in API output to always use https://
protocol (default=False)
template_debug Allow display of template debug information with
?_context=1 (default=False)
trace_debug Allow display of SQL trace debug information with
?_trace=1 (default=False)
base_url Datasette URLs should use this base path
(default=/)
[[[end]]]","[""CLI reference"", ""datasette serve""]",[]
cli-reference:cli-help-serve-help,cli-reference,cli-help-serve-help,datasette serve,"This command starts the Datasette web application running on your machine:
datasette serve mydatabase.db
Or since this is the default command you can run this instead:
datasette mydatabase.db
Once started you can access it at http://localhost:8001
[[[cog
help([""serve"", ""--help""])
]]]
Usage: datasette serve [OPTIONS] [FILES]...
Serve up specified SQLite database files with a web UI
Options:
-i, --immutable PATH Database files to open in immutable mode
-h, --host TEXT Host for server. Defaults to 127.0.0.1 which
means only connections from the local machine
will be allowed. Use 0.0.0.0 to listen to all
IPs and allow access from other machines.
-p, --port INTEGER RANGE Port for server, defaults to 8001. Use -p 0 to
automatically assign an available port.
[0<=x<=65535]
--uds TEXT Bind to a Unix domain socket
--reload Automatically reload if code or metadata
change detected - useful for development
--cors Enable CORS by serving Access-Control-Allow-
Origin: *
--load-extension PATH:ENTRYPOINT?
Path to a SQLite extension to load, and
optional entrypoint
--inspect-file TEXT Path to JSON file created using ""datasette
inspect""
-m, --metadata FILENAME Path to JSON/YAML file containing
license/source metadata
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--memory Make /_memory database available
--config CONFIG Deprecated: set config option using
configname:value. Use --setting instead.
--setting SETTING... Setting, see
docs.datasette.io/en/stable/settings.html
--secret TEXT Secret used for signing secure values, such as
signed cookies
--root Output URL that sets a cookie authenticating
the root user
--get TEXT Run an HTTP GET request against this path,
print results and exit
--version-note TEXT Additional note to show on /-/versions
--help-settings Show available settings
--pdb Launch debugger on any errors
-o, --open Open Datasette in your web browser
--create Create database files if they do not exist
--crossdb Enable cross-database joins using the /_memory
database
--nolock Ignore locking, open locked files in read-only
mode
--ssl-keyfile TEXT SSL key file
--ssl-certfile TEXT SSL certificate file
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-publish-heroku-help,cli-reference,cli-help-publish-heroku-help,datasette publish heroku,"See Publishing to Heroku .
[[[cog
help([""publish"", ""heroku"", ""--help""])
]]]
Usage: datasette publish heroku [OPTIONS] [FILES]...
Publish databases to Datasette running on Heroku
Options:
-m, --metadata FILENAME Path to JSON/YAML file containing metadata to
publish
--extra-options TEXT Extra options to pass to datasette serve
--branch TEXT Install datasette from a GitHub branch e.g.
main
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--install TEXT Additional packages (e.g. plugins) to install
--plugin-secret ...
Secrets to pass to plugins, e.g. --plugin-
secret datasette-auth-github client_id xxx
--version-note TEXT Additional note to show on /-/versions
--secret TEXT Secret used for signing secure values, such as
signed cookies
--title TEXT Title for metadata
--license TEXT License label for metadata
--license_url TEXT License URL for metadata
--source TEXT Source label for metadata
--source_url TEXT Source URL for metadata
--about TEXT About label for metadata
--about_url TEXT About URL for metadata
-n, --name TEXT Application name to use when deploying
--tar TEXT --tar option to pass to Heroku, e.g.
--tar=/usr/local/bin/gtar
--generate-dir DIRECTORY Output generated application files and stop
without deploying
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-publish-help,cli-reference,cli-help-publish-help,datasette publish,"Shows a list of available deployment targets for publishing data with Datasette.
Additional deployment targets can be added by plugins that use the publish_subcommand(publish) hook.
[[[cog
help([""publish"", ""--help""])
]]]
Usage: datasette publish [OPTIONS] COMMAND [ARGS]...
Publish specified SQLite database files to the internet along with a
Datasette-powered interface and API
Options:
--help Show this message and exit.
Commands:
cloudrun Publish databases to Datasette running on Cloud Run
heroku Publish databases to Datasette running on Heroku
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-publish-cloudrun-help,cli-reference,cli-help-publish-cloudrun-help,datasette publish cloudrun,"See Publishing to Google Cloud Run .
[[[cog
help([""publish"", ""cloudrun"", ""--help""])
]]]
Usage: datasette publish cloudrun [OPTIONS] [FILES]...
Publish databases to Datasette running on Cloud Run
Options:
-m, --metadata FILENAME Path to JSON/YAML file containing metadata to
publish
--extra-options TEXT Extra options to pass to datasette serve
--branch TEXT Install datasette from a GitHub branch e.g.
main
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--install TEXT Additional packages (e.g. plugins) to install
--plugin-secret ...
Secrets to pass to plugins, e.g. --plugin-
secret datasette-auth-github client_id xxx
--version-note TEXT Additional note to show on /-/versions
--secret TEXT Secret used for signing secure values, such as
signed cookies
--title TEXT Title for metadata
--license TEXT License label for metadata
--license_url TEXT License URL for metadata
--source TEXT Source label for metadata
--source_url TEXT Source URL for metadata
--about TEXT About label for metadata
--about_url TEXT About URL for metadata
-n, --name TEXT Application name to use when building
--service TEXT Cloud Run service to deploy (or over-write)
--spatialite Enable SpatiaLite extension
--show-files Output the generated Dockerfile and
metadata.json
--memory TEXT Memory to allocate in Cloud Run, e.g. 1Gi
--cpu [1|2|4] Number of vCPUs to allocate in Cloud Run
--timeout INTEGER Build timeout in seconds
--apt-get-install TEXT Additional packages to apt-get install
--max-instances INTEGER Maximum Cloud Run instances
--min-instances INTEGER Minimum Cloud Run instances
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-plugins-help,cli-reference,cli-help-plugins-help,datasette plugins,"Output JSON showing all currently installed plugins, their versions, whether they include static files or templates and which Plugin hooks they use.
[[[cog
help([""plugins"", ""--help""])
]]]
Usage: datasette plugins [OPTIONS]
List currently installed plugins
Options:
--all Include built-in default plugins
--plugins-dir DIRECTORY Path to directory containing custom plugins
--help Show this message and exit.
[[[end]]]
Example output:
[
{
""name"": ""datasette-geojson"",
""static"": false,
""templates"": false,
""version"": ""0.3.1"",
""hooks"": [
""register_output_renderer""
]
},
{
""name"": ""datasette-geojson-map"",
""static"": true,
""templates"": false,
""version"": ""0.4.0"",
""hooks"": [
""extra_body_script"",
""extra_css_urls"",
""extra_js_urls""
]
},
{
""name"": ""datasette-leaflet"",
""static"": true,
""templates"": false,
""version"": ""0.2.2"",
""hooks"": [
""extra_body_script"",
""extra_template_vars""
]
}
]","[""CLI reference""]",[]
cli-reference:cli-help-package-help,cli-reference,cli-help-package-help,datasette package,"Package SQLite files into a Datasette Docker container, see datasette package .
[[[cog
help([""package"", ""--help""])
]]]
Usage: datasette package [OPTIONS] FILES...
Package SQLite files into a Datasette Docker container
Options:
-t, --tag TEXT Name for the resulting Docker container, can
optionally use name:tag format
-m, --metadata FILENAME Path to JSON/YAML file containing metadata to
publish
--extra-options TEXT Extra options to pass to datasette serve
--branch TEXT Install datasette from a GitHub branch e.g. main
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at /MOUNT/...
--install TEXT Additional packages (e.g. plugins) to install
--spatialite Enable SpatiaLite extension
--version-note TEXT Additional note to show on /-/versions
--secret TEXT Secret used for signing secure values, such as
signed cookies
-p, --port INTEGER RANGE Port to run the server on, defaults to 8001
[1<=x<=65535]
--title TEXT Title for metadata
--license TEXT License label for metadata
--license_url TEXT License URL for metadata
--source TEXT Source label for metadata
--source_url TEXT Source URL for metadata
--about TEXT About label for metadata
--about_url TEXT About URL for metadata
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-install-help,cli-reference,cli-help-install-help,datasette install,"Install new Datasette plugins. This command works like pip install but ensures that your plugins will be installed into the same environment as Datasette.
This command:
datasette install datasette-cluster-map
Would install the datasette-cluster-map plugin.
[[[cog
help([""install"", ""--help""])
]]]
Usage: datasette install [OPTIONS] PACKAGES...
Install plugins and packages from PyPI into the same environment as Datasette
Options:
-U, --upgrade Upgrade packages to latest version
--help Show this message and exit.
[[[end]]]","[""CLI reference""]","[{""href"": ""https://datasette.io/plugins/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}]"
cli-reference:cli-help-inspect-help,cli-reference,cli-help-inspect-help,datasette inspect,"Outputs JSON representing introspected data about one or more SQLite database files.
If you are opening an immutable database, you can pass this file to the --inspect-file option to improve Datasette's performance by allowing it to skip running row counts against the database when it first starts running:
datasette inspect mydatabase.db > inspect-data.json
datasette serve -i mydatabase.db --inspect-file inspect-data.json
This performance optimization is used automatically by some of the datasette publish commands. You are unlikely to need to apply this optimization manually.
[[[cog
help([""inspect"", ""--help""])
]]]
Usage: datasette inspect [OPTIONS] [FILES]...
Generate JSON summary of provided database files
This can then be passed to ""datasette --inspect-file"" to speed up count
operations against immutable database files.
Options:
--inspect-file TEXT
--load-extension PATH:ENTRYPOINT?
Path to a SQLite extension to load, and
optional entrypoint
--help Show this message and exit.
[[[end]]]","[""CLI reference""]",[]
cli-reference:cli-help-help,cli-reference,cli-help-help,datasette --help,"Running datasette --help shows a list of all of the available commands.
[[[cog
help([""--help""])
]]]
Usage: datasette [OPTIONS] COMMAND [ARGS]...
Datasette is an open source multi-tool for exploring and publishing data
About Datasette: https://datasette.io/
Full documentation: https://docs.datasette.io/
Options:
--version Show the version and exit.
--help Show this message and exit.
Commands:
serve* Serve up specified SQLite database files with a web UI
inspect Generate JSON summary of provided database files
install Install plugins and packages from PyPI into the same...
package Package SQLite files into a Datasette Docker container
plugins List currently installed plugins
publish Publish specified SQLite database files to the internet along...
uninstall Uninstall plugins and Python packages from the Datasette...
[[[end]]]
Additional commands added by plugins that use the register_commands(cli) hook will be listed here as well.","[""CLI reference""]",[]
cli-reference:cli-datasette-get,cli-reference,cli-datasette-get,datasette --get,"The --get option to datasette serve (or just datasette ) specifies the path to a page within Datasette and causes Datasette to output the content from that path without starting the web server.
This means that all of Datasette's functionality can be accessed directly from the command-line.
For example:
$ datasette --get '/-/versions.json' | jq .
{
""python"": {
""version"": ""3.8.5"",
""full"": ""3.8.5 (default, Jul 21 2020, 10:48:26) \n[Clang 11.0.3 (clang-1103.0.32.62)]""
},
""datasette"": {
""version"": ""0.46+15.g222a84a.dirty""
},
""asgi"": ""3.0"",
""uvicorn"": ""0.11.8"",
""sqlite"": {
""version"": ""3.32.3"",
""fts_versions"": [
""FTS5"",
""FTS4"",
""FTS3""
],
""extensions"": {
""json1"": null
},
""compile_options"": [
""COMPILER=clang-11.0.3"",
""ENABLE_COLUMN_METADATA"",
""ENABLE_FTS3"",
""ENABLE_FTS3_PARENTHESIS"",
""ENABLE_FTS4"",
""ENABLE_FTS5"",
""ENABLE_GEOPOLY"",
""ENABLE_JSON1"",
""ENABLE_PREUPDATE_HOOK"",
""ENABLE_RTREE"",
""ENABLE_SESSION"",
""MAX_VARIABLE_NUMBER=250000"",
""THREADSAFE=1""
]
}
}
The exit code will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error.
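For example, a minimal smoke-test sketch in Python - the database file name and the path being fetched are illustrative:
import subprocess

result = subprocess.run(
    [""datasette"", ""mydatabase.db"", ""--get"", ""/-/versions.json""],
    capture_output=True,
)
# A non-zero exit code means the page returned something other than HTTP 200
assert result.returncode == 0, result.stderr.decode()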
This lets you use datasette --get / to run tests against a Datasette application in a continuous integration environment such as GitHub Actions.","[""CLI reference"", ""datasette serve""]",[]
sql_queries:canned-queries-writable,sql_queries,canned-queries-writable,Writable canned queries,"Canned queries are read-only by default. You can use the ""write"": true key to indicate that a canned query can write to the database.
See Controlling access to specific canned queries for details on how to add permission checks to canned queries, using the ""allow"" key.
{
""databases"": {
""mydatabase"": {
""queries"": {
""add_name"": {
""sql"": ""INSERT INTO names (name) VALUES (:name)"",
""write"": true
}
}
}
}
}
This configuration will create a page at /mydatabase/add_name displaying a form with a name field. Submitting that form will execute the configured INSERT query.
You can customize how Datasette represents success and errors using the following optional properties:
on_success_message - the message shown when a query is successful
on_success_redirect - the path or URL the user is redirected to on success
on_error_message - the message shown when a query throws an error
on_error_redirect - the path or URL the user is redirected to on error
For example:
{
""databases"": {
""mydatabase"": {
""queries"": {
""add_name"": {
""sql"": ""INSERT INTO names (name) VALUES (:name)"",
""write"": true,
""on_success_message"": ""Name inserted"",
""on_success_redirect"": ""/mydatabase/names"",
""on_error_message"": ""Name insert failed"",
""on_error_redirect"": ""/mydatabase""
}
}
}
}
}
You can use ""params"" to explicitly list the named parameters that should be displayed as form fields - otherwise they will be automatically detected.
You can pre-populate form fields when the page first loads using a query string, e.g. /mydatabase/add_name?name=Prepopulated . The user will have to submit the form to execute the query.","[""Running SQL queries"", ""Canned queries""]",[]
sql_queries:canned-queries-options,sql_queries,canned-queries-options,Additional canned query options,Additional options can be specified for canned queries in the YAML or JSON configuration.,"[""Running SQL queries"", ""Canned queries""]",[]
sql_queries:canned-queries-named-parameters,sql_queries,canned-queries-named-parameters,Canned query parameters,"Canned queries support named parameters, so if you include those in the SQL you will then be able to enter them using the form fields on the canned query page or by adding them to the URL. This means canned queries can be used to create custom JSON APIs based on a carefully designed SQL statement.
Here's an example of a canned query with a named parameter:
select neighborhood, facet_cities.name, state
from facetable
join facet_cities on facetable.city_id = facet_cities.id
where neighborhood like '%' || :text || '%'
order by neighborhood;
In the canned query metadata (here Using YAML for metadata as metadata.yaml ) it looks like this:
databases:
  fixtures:
    queries:
      neighborhood_search:
        sql: |-
          select neighborhood, facet_cities.name, state
          from facetable
          join facet_cities on facetable.city_id = facet_cities.id
          where neighborhood like '%' || :text || '%'
          order by neighborhood
        title: Search neighborhoods
Here's the equivalent using JSON (as metadata.json ):
{
""databases"": {
""fixtures"": {
""queries"": {
""neighborhood_search"": {
""sql"": ""select neighborhood, facet_cities.name, state\nfrom facetable\n join facet_cities on facetable.city_id = facet_cities.id\nwhere neighborhood like '%' || :text || '%'\norder by neighborhood"",
""title"": ""Search neighborhoods""
}
}
}
}
}
Note that we are using SQLite string concatenation here - the || operator - to add wildcard % characters to the string provided by the user.
You can try this canned query out here:
https://latest.datasette.io/fixtures/neighborhood_search?text=town
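Canned queries are also exposed as JSON APIs - append .json to the query path. As a minimal sketch, calling the demo query above from Python (assuming the default column names from the SQL are returned as keys):
import json
import urllib.request

url = (
    ""https://latest.datasette.io/fixtures/""
    ""neighborhood_search.json?text=town&_shape=array""
)
with urllib.request.urlopen(url) as response:
    # _shape=array returns a plain JSON list of row objects
    for row in json.loads(response.read()):
        print(row[""neighborhood""], row[""state""])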
In this example the :text named parameter is automatically extracted from the query using a regular expression.
You can alternatively provide an explicit list of named parameters using the ""params"" key, like this:
databases:
  fixtures:
    queries:
      neighborhood_search:
        params:
        - text
        sql: |-
          select neighborhood, facet_cities.name, state
          from facetable
          join facet_cities on facetable.city_id = facet_cities.id
          where neighborhood like '%' || :text || '%'
          order by neighborhood
        title: Search neighborhoods","[""Running SQL queries"", ""Canned queries""]","[{""href"": ""https://latest.datasette.io/fixtures/neighborhood_search?text=town"", ""label"": ""https://latest.datasette.io/fixtures/neighborhood_search?text=town""}]"
sql_queries:canned-queries-magic-parameters,sql_queries,canned-queries-magic-parameters,Magic parameters,"Named parameters that start with an underscore are special: they can be used to automatically add values created by Datasette that are not contained in the incoming form fields or query string.
These magic parameters are only supported for canned queries: to avoid security issues (such as queries that extract the user's private cookies) they are not available to SQL that is executed by the user as a custom SQL query.
Available magic parameters are:
_actor_* - e.g. _actor_id , _actor_name
Fields from the currently authenticated Actors .
_header_* - e.g. _header_user_agent
Header from the incoming HTTP request. The key should be in lower case and with hyphens converted to underscores e.g. _header_user_agent or _header_accept_language .
_cookie_* - e.g. _cookie_lang
The value of the incoming cookie of that name.
_now_epoch
The number of seconds since the Unix epoch.
_now_date_utc
The date in UTC, e.g. 2020-06-01
_now_datetime_utc
The ISO 8601 datetime in UTC, e.g. 2020-06-24T18:01:07Z
_random_chars_* - e.g. _random_chars_128
A random string of characters of the specified length.
Here's an example configuration (this time using metadata.yaml since that provides better support for multi-line SQL queries) that adds a message from the authenticated user, storing various pieces of additional metadata using magic parameters:
databases:
  mydatabase:
    queries:
      add_message:
        allow:
          id: ""*""
        sql: |-
          INSERT INTO messages (
            user_id, message, datetime
          ) VALUES (
            :_actor_id, :message, :_now_datetime_utc
          )
        write: true
The form presented at /mydatabase/add_message will have just a field for message - the other parameters will be populated by the magic parameter mechanism.
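Plugins can also register magic parameters of their own, as noted below. Here is a minimal sketch of such a plugin, assuming the hook's (name, function) registration format - the _request_ip name is illustrative:
from datasette import hookimpl


def request_ip(key, request):
    # Called for :_request_ip_* parameters; key is the suffix after the prefix
    if key == ""ip"":
        return request.scope[""client""][0]
    # Unknown keys should raise KeyError
    raise KeyError


@hookimpl
def register_magic_parameters(datasette):
    # Makes :_request_ip_ip available to canned queries
    return [(""request_ip"", request_ip)]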
Additional custom magic parameters, such as the _request_ip example sketched above, can be added by plugins using the register_magic_parameters(datasette) hook.","[""Running SQL queries"", ""Canned queries""]",[]
sql_queries:canned-queries-json-api,sql_queries,canned-queries-json-api,JSON API for writable canned queries,"Writable canned queries can also be accessed using a JSON API. You can POST data to them using JSON, and you can request that the response be returned as JSON.
To submit JSON to a writable canned query, encode key/value parameters as a JSON document:
POST /mydatabase/add_message
{""message"": ""Message goes here""}
You can also continue to submit data using regular form encoding, like so:
POST /mydatabase/add_message
message=Message+goes+here
There are three ways to specify that you would like a JSON response to your request, rather than an HTTP redirect to another page:
Set an Accept: application/json header on your request
Include ?_json=1 in the URL that you POST to
Include ""_json"": 1 in your JSON body, or &_json=1 in your form encoded body
The JSON response will look like this:
{
""ok"": true,
""message"": ""Query executed, 1 row affected"",
""redirect"": ""/data/add_name""
}
The ""message"" and ""redirect"" values here will take into account on_success_message , on_success_redirect , on_error_message and on_error_redirect , if they have been set.","[""Running SQL queries"", ""Canned queries""]",[]
changelog:bug-fixes-and-other-improvements,changelog,bug-fixes-and-other-improvements,Bug fixes and other improvements,"Custom pages now work correctly when combined with the base_url setting. ( #1238 )
Fixed intermittent error displaying the index page when the user did not have permission to access one of the tables. Thanks, Guy Freeman. ( #1305 )
Columns with the name ""Link"" are no longer incorrectly displayed in bold. ( #1308 )
Fixed error caused by tables with a single quote in their names. ( #1257 )
Updated dependencies: pytest-asyncio , Black , jinja2 , aiofiles , click , and itsdangerous .
The official Datasette Docker image now supports apt-get install . ( #1320 )
The Heroku runtime used by datasette publish heroku is now python-3.8.10 .","[""Changelog"", ""0.57 (2021-06-05)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1238"", ""label"": ""#1238""}, {""href"": ""https://github.com/simonw/datasette/issues/1305"", ""label"": ""#1305""}, {""href"": ""https://github.com/simonw/datasette/issues/1308"", ""label"": ""#1308""}, {""href"": ""https://github.com/simonw/datasette/issues/1257"", ""label"": ""#1257""}, {""href"": ""https://github.com/simonw/datasette/issues/1320"", ""label"": ""#1320""}]"
changelog:bug-fixes,changelog,bug-fixes,Bug fixes,"Don't show the facet option in the cog menu if faceting is not allowed. ( #1683 )
?_sort and ?_sort_desc now work if the column that is being sorted has been excluded from the query using ?_col= or ?_nocol= . ( #1773 )
Fixed bug where ?_sort_desc was duplicated in the URL every time the Apply button was clicked. ( #1738 )","[""Changelog"", ""0.62 (2022-08-14)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1683"", ""label"": ""#1683""}, {""href"": ""https://github.com/simonw/datasette/issues/1773"", ""label"": ""#1773""}, {""href"": ""https://github.com/simonw/datasette/issues/1738"", ""label"": ""#1738""}]"
binary_data:binary-plugins,binary_data,binary-plugins,Binary plugins,"Several Datasette plugins are available that change the way Datasette treats binary data.
datasette-render-binary modifies Datasette's default interface to show an automatic guess at what type of binary data is being stored, along with a visual representation of the binary value that displays ASCII strings directly in the interface.
datasette-render-images detects common image formats and renders them as images directly in the Datasette interface.
datasette-media allows Datasette interfaces to be configured to serve binary files from configured SQL queries, and includes the ability to resize images directly before serving them.","[""Binary data""]","[{""href"": ""https://github.com/simonw/datasette-render-binary"", ""label"": ""datasette-render-binary""}, {""href"": ""https://github.com/simonw/datasette-render-images"", ""label"": ""datasette-render-images""}, {""href"": ""https://github.com/simonw/datasette-media"", ""label"": ""datasette-media""}]"
binary_data:binary-linking,binary_data,binary-linking,Linking to binary downloads,"The .blob output format is used to return binary data. It requires a _blob_column= query string argument specifying which BLOB column should be downloaded, for example:
https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data
This output format can also be used to return binary data from an arbitrary SQL query. Since such queries do not specify an exact row, an additional ?_blob_hash= parameter can be used to specify the SHA-256 hash of the value that is being linked to.
Consider the query select data from binary_data - demonstrated here .
That page links to the binary value downloads. Those links look like this:
https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d
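For example, a short sketch that downloads that binary value and re-computes its SHA-256 hash to confirm it matches the _blob_hash parameter in the URL:
import hashlib
import urllib.request

url = (
    ""https://latest.datasette.io/fixtures.blob""
    ""?sql=select+data+from+binary_data""
    ""&_blob_column=data""
    ""&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d""
)
with urllib.request.urlopen(url) as response:
    data = response.read()
# Should print the hash from the _blob_hash parameter above
print(hashlib.sha256(data).hexdigest())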
These .blob links are also returned in the .csv exports Datasette provides for binary tables and queries, since the CSV format does not have a mechanism for representing binary data.","[""Binary data""]","[{""href"": ""https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data"", ""label"": ""https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data""}, {""href"": ""https://latest.datasette.io/fixtures?sql=select+data+from+binary_data"", ""label"": ""demonstrated here""}, {""href"": ""https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d"", ""label"": ""https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d""}]"
changelog:binary-data,changelog,binary-data,Binary data,"SQLite tables can contain binary data in BLOB columns. Datasette now provides links for users to download this data directly from Datasette, and uses those links to make binary data available from CSV exports. See Binary data for more details. ( #1036 and #1034 ).","[""Changelog"", ""0.51 (2020-10-31)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1036"", ""label"": ""#1036""}, {""href"": ""https://github.com/simonw/datasette/issues/1034"", ""label"": ""#1034""}]"
binary_data:binary,binary_data,binary,Binary data,"SQLite tables can contain binary data in BLOB columns.
Datasette includes special handling for these binary values. The Datasette interface detects binary values and provides a link to download their content, for example on https://latest.datasette.io/fixtures/binary_data
Binary data is represented in .json exports using Base64 encoding.
https://latest.datasette.io/fixtures/binary_data.json?_shape=array
[
{
""rowid"": 1,
""data"": {
""$base64"": true,
""encoded"": ""FRwCx60F/g==""
}
},
{
""rowid"": 2,
""data"": {
""$base64"": true,
""encoded"": ""FRwDx60F/g==""
}
},
{
""rowid"": 3,
""data"": null
}
]",[],"[{""href"": ""https://latest.datasette.io/fixtures/binary_data"", ""label"": ""https://latest.datasette.io/fixtures/binary_data""}, {""href"": ""https://latest.datasette.io/fixtures/binary_data.json?_shape=array"", ""label"": ""https://latest.datasette.io/fixtures/binary_data.json?_shape=array""}]"
changelog:better-plugin-documentation,changelog,better-plugin-documentation,Better plugin documentation,"The plugin documentation has been re-arranged into four sections, including a brand new section on testing plugins. ( #687 )
Plugins introduces Datasette's plugin system and describes how to install and configure plugins.
Writing plugins describes how to author plugins, from one-off single file plugins to packaged plugins that can be published to PyPI. It also describes how to start a plugin using the new datasette-plugin cookiecutter template.
Plugin hooks is a full list of detailed documentation for every Datasette plugin hook.
Testing plugins describes how to write tests for Datasette plugins, using pytest and HTTPX .","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette/issues/687"", ""label"": ""#687""}, {""href"": ""https://github.com/simonw/datasette-plugin"", ""label"": ""datasette-plugin""}, {""href"": ""https://docs.pytest.org/"", ""label"": ""pytest""}, {""href"": ""https://www.python-httpx.org/"", ""label"": ""HTTPX""}]"
authentication:authentication-root,authentication,authentication-root,"Using the ""root"" actor","Datasette currently leaves almost all forms of authentication to plugins - datasette-auth-github for example.
The one exception is the ""root"" account, which you can sign into while using Datasette on your local machine. This provides access to a small number of debugging features.
To sign in as root, start Datasette using the --root command-line option, like this:
$ datasette --root
http://127.0.0.1:8001/-/auth-token?token=786fc524e0199d70dc9a581d851f466244e114ca92f33aa3b42a139e9388daa7
INFO: Started server process [25801]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
The URL on the first line includes a one-use token which can be used to sign in as the ""root"" actor in your browser. Click on that link and then visit http://127.0.0.1:8001/-/actor to confirm that you are authenticated as an actor that looks like this:
{
""id"": ""root""
}","[""Authentication and permissions"", ""Actors""]","[{""href"": ""https://github.com/simonw/datasette-auth-github"", ""label"": ""datasette-auth-github""}]"
authentication:authentication-permissions-table,authentication,authentication-permissions-table,Controlling access to specific tables and views,"To limit access to the users table in your bakery.db database:
{
""databases"": {
""bakery"": {
""tables"": {
""users"": {
""allow"": {
""id"": ""*""
}
}
}
}
}
}
This works for SQL views as well - you can list their names in the ""tables"" block above in the same way as regular tables.
Restricting access to tables and views in this way will NOT prevent users from querying them using arbitrary SQL queries, like this for example.
If you are restricting access to specific tables you should also use the ""allow_sql"" block to prevent users from bypassing the limit with their own SQL queries - see Controlling the ability to execute arbitrary SQL .","[""Authentication and permissions"", ""Configuring permissions in metadata.json""]","[{""href"": ""https://latest.datasette.io/fixtures?sql=select+*+from+facetable"", ""label"": ""like this""}]"
authentication:authentication-permissions-query,authentication,authentication-permissions-query,Controlling access to specific canned queries,"Canned queries allow you to configure named SQL queries in your metadata.json that can be executed by users. These queries can be set up to both read and write to the database, so controlling who can execute them can be important.
To limit access to the add_name canned query in your dogs.db database to just the root user :
{
""databases"": {
""dogs"": {
""queries"": {
""add_name"": {
""sql"": ""INSERT INTO names (name) VALUES (:name)"",
""write"": true,
""allow"": {
""id"": [""root""]
}
}
}
}
}
}","[""Authentication and permissions"", ""Configuring permissions in metadata.json""]",[]
authentication:authentication-permissions-metadata,authentication,authentication-permissions-metadata,Configuring permissions in metadata.json,"You can limit who is allowed to view different parts of your Datasette instance using ""allow"" keys in your Metadata configuration.
You can control the following:
Access to the entire Datasette instance
Access to specific databases
Access to specific tables and views
Access to specific Canned queries
If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries.","[""Authentication and permissions""]",[]
authentication:authentication-permissions-instance,authentication,authentication-permissions-instance,Controlling access to an instance,"Here's how to restrict access to your entire Datasette instance to just the ""id"": ""root"" user:
{
""title"": ""My private Datasette instance"",
""allow"": {
""id"": ""root""
}
}
To deny access to all users, you can use ""allow"": false :
{
""title"": ""My entirely inaccessible instance"",
""allow"": false
}
One reason to do this is if you are using a Datasette plugin - such as datasette-permissions-sql - to control permissions instead.","[""Authentication and permissions"", ""Configuring permissions in metadata.json""]","[{""href"": ""https://github.com/simonw/datasette-permissions-sql"", ""label"": ""datasette-permissions-sql""}]"
authentication:authentication-permissions-execute-sql,authentication,authentication-permissions-execute-sql,Controlling the ability to execute arbitrary SQL,"Datasette defaults to allowing any site visitor to execute their own custom SQL queries, for example using the form on the database page or by appending a ?_where= parameter to the table page like this .
Access to this ability is controlled by the execute-sql permission.
The easiest way to disable arbitrary SQL queries is using the default_allow_sql setting when you first start Datasette running.
You can alternatively use an ""allow_sql"" block to control who is allowed to execute arbitrary SQL queries.
To prevent any user from executing arbitrary SQL queries, use this:
{
""allow_sql"": false
}
To enable just the root user to execute SQL for all databases in your instance, use the following:
{
""allow_sql"": {
""id"": ""root""
}
}
To limit this ability for just one specific database, use this:
{
""databases"": {
""mydatabase"": {
""allow_sql"": {
""id"": ""root""
}
}
}
}","[""Authentication and permissions"", ""Configuring permissions in metadata.json""]","[{""href"": ""https://latest.datasette.io/fixtures"", ""label"": ""the database page""}, {""href"": ""https://latest.datasette.io/fixtures/facetable?_where=_city_id=1"", ""label"": ""like this""}]"
authentication:authentication-permissions-database,authentication,authentication-permissions-database,Controlling access to specific databases,"To limit access to a specific private.db database to just authenticated users, use the ""allow"" block like this:
{
""databases"": {
""private"": {
""allow"": {
""id"": ""*""
}
}
}
}","[""Authentication and permissions"", ""Configuring permissions in metadata.json""]",[]
authentication:authentication-permissions-allow,authentication,authentication-permissions-allow,"Defining permissions with ""allow"" blocks","The standard way to define permissions in Datasette is to use an ""allow"" block. This is a JSON document describing which actors are allowed to perform an action.
The most basic form of allow block is this ( allow demo , deny demo ):
{
""allow"": {
""id"": ""root""
}
}
This will match any actors with an ""id"" property of ""root"" - for example, an actor that looks like this:
{
""id"": ""root"",
""name"": ""Root User""
}
An allow block can specify ""deny all"" using false ( demo ):
{
""allow"": false
}
An ""allow"" of true allows all access ( demo ):
{
""allow"": true
}
Allow keys can provide a list of values. These will match any actor that has any of those values ( allow demo , deny demo ):
{
""allow"": {
""id"": [""simon"", ""cleopaws""]
}
}
This will match any actor with an ""id"" of either ""simon"" or ""cleopaws"" .
Actors can have properties that feature a list of values. These will be matched against the list of values in an allow block. Consider the following actor:
{
""id"": ""simon"",
""roles"": [""staff"", ""developer""]
}
This allow block will provide access to any actor that has ""developer"" as one of their roles ( allow demo , deny demo ):
{
""allow"": {
""roles"": [""developer""]
}
}
Note that ""roles"" is not a concept that is baked into Datasette - it's a convention that plugins can choose to implement and act on.
If you want to provide access to any actor with a value for a specific key, use ""*"" . For example, to match any logged-in user specify the following ( allow demo , deny demo ):
{
""allow"": {
""id"": ""*""
}
}
You can specify that only unauthenticated actors (from anonymous HTTP requests) should be allowed access using the special ""unauthenticated"": true key in an allow block ( allow demo , deny demo ):
{
""allow"": {
""unauthenticated"": true
}
}
Allow keys act as an ""or"" mechanism. An actor will be able to execute the query if any of their JSON properties match any of the values in the corresponding lists in the allow block. The following block will allow users with either a role of ""ops"" OR users who have an id of ""simon"" or ""cleopaws"" :
{
""allow"": {
""id"": [""simon"", ""cleopaws""],
""role"": ""ops""
}
}
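You can check how an actor evaluates against a block like this using the datasette.utils.actor_matches_allow(actor, allow) function, covered under actor_matches_allow() below - a quick sketch:
from datasette.utils import actor_matches_allow

allow = {""id"": [""simon"", ""cleopaws""], ""role"": ""ops""}
# Matches on ""id""
print(actor_matches_allow({""id"": ""cleopaws""}, allow))  # True
# Matches on ""role""
print(actor_matches_allow({""id"": ""trevor"", ""role"": [""ops""]}, allow))  # True
# Matches neither rule
print(actor_matches_allow({""id"": ""percy"", ""role"": [""staff""]}, allow))  # False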
Demo for cleopaws , demo for ops role , demo for an actor matching neither rule .","[""Authentication and permissions"", ""Permissions""]","[{""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22root%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D"", ""label"": ""allow demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22trevor%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D"", ""label"": ""deny demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22root%22%0D%0A%7D&allow=false"", ""label"": ""demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22root%22%0D%0A%7D&allow=true"", ""label"": ""demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%0D%0A%7D"", ""label"": ""allow demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22pancakes%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%0D%0A%7D"", ""label"": ""deny demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22simon%22%2C%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22staff%22%2C%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D"", ""label"": ""allow demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%2C%0D%0A++++%22roles%22%3A+%5B%22dog%22%5D%0D%0A%7D&allow=%7B%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D"", ""label"": ""deny demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22simon%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%22*%22%0D%0A%7D"", ""label"": ""allow demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22bot%22%3A+%22readme-bot%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%22*%22%0D%0A%7D"", ""label"": ""deny demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=null&allow=%7B%0D%0A++++%22unauthenticated%22%3A+true%0D%0A%7D"", ""label"": ""allow demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22hello%22%0D%0A%7D&allow=%7B%0D%0A++++%22unauthenticated%22%3A+true%0D%0A%7D"", ""label"": ""deny demo""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D"", ""label"": ""Demo for cleopaws""}, {""href"": ""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22trevor%22%2C%0D%0A++++%22role%22%3A+%5B%0D%0A++++++++%22ops%22%2C%0D%0A++++++++%22staff%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D"", ""label"": ""demo for ops role""}, {""href"": 
""https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22percy%22%2C%0D%0A++++%22role%22%3A+%5B%0D%0A++++++++%22staff%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D"", ""label"": ""demo for an actor matching neither rule""}]"
authentication:authentication-permissions,authentication,authentication-permissions,Permissions,"Datasette has an extensive permissions system built-in, which can be further extended and customized by plugins.
The key question the permissions system answers is this:
Is this actor allowed to perform this action , optionally against this particular resource ?
Actors are described above .
An action is a string describing the action the actor would like to perform. A full list is provided below - examples include view-table and execute-sql .
A resource is the item the actor wishes to interact with - for example a specific database or table. Some actions, such as permissions-debug , are not associated with a particular resource.
Datasette's built-in view permissions ( view-database , view-table etc) default to allow - unless you configure additional permission rules, unauthenticated users will be allowed to access content.
Permissions with potentially harmful effects should default to deny . Plugin authors should account for this when designing new plugins - for example, the datasette-upload-csvs plugin defaults to deny so that installations don't accidentally allow unauthenticated users to create new tables by uploading a CSV file.","[""Authentication and permissions""]","[{""href"": ""https://github.com/simonw/datasette-upload-csvs"", ""label"": ""datasette-upload-csvs""}]"
authentication:authentication-ds-actor-expiry,authentication,authentication-ds-actor-expiry,Including an expiry time,"ds_actor cookies can optionally include a signed expiry timestamp, after which the cookies will no longer be valid. Authentication plugins may choose to use this mechanism to limit the lifetime of the cookie. For example, if a plugin implements single sign-on against another source it may decide to set short-lived cookies, so that if the user is removed from the SSO system their existing Datasette cookies will stop working shortly afterwards.
To include an expiry, add a ""e"" key to the cookie value containing a base62-encoded integer representing the timestamp when the cookie should expire. For example, here's how to set a cookie that expires after 24 hours:
import time

from datasette.utils import baseconv
from datasette.utils.asgi import Response

# ""datasette"" here is the Datasette instance passed to your plugin hook
expires_at = int(time.time()) + (24 * 60 * 60)
response = Response.redirect(""/"")
response.set_cookie(
    ""ds_actor"",
    datasette.sign(
        {
            ""a"": {""id"": ""cleopaws""},
            ""e"": baseconv.base62.encode(expires_at),
        },
        ""actor"",
    ),
)
The resulting cookie will encode data that looks something like this:
{
""a"": {
""id"": ""cleopaws""
},
""e"": ""1jjSji""
}","[""Authentication and permissions"", ""The ds_actor cookie""]",[]
authentication:authentication-ds-actor,authentication,authentication-ds-actor,The ds_actor cookie,"Datasette includes a default authentication plugin which looks for a signed ds_actor cookie containing a JSON actor dictionary. This is how the root actor mechanism works.
Authentication plugins can set signed ds_actor cookies themselves like so:
from datasette.utils.asgi import Response

response = Response.redirect(""/"")
response.set_cookie(
    ""ds_actor"",
    datasette.sign({""a"": {""id"": ""cleopaws""}}, ""actor""),
)
Note that you need to pass ""actor"" as the namespace to .sign(value, namespace=""default"") .
The shape of data encoded in the cookie is as follows:
{
""a"": {... actor ...}
}","[""Authentication and permissions""]",[]
authentication:authentication-actor-matches-allow,authentication,authentication-actor-matches-allow,actor_matches_allow(),"Plugins that wish to implement this same ""allow"" block permissions scheme can take advantage of the datasette.utils.actor_matches_allow(actor, allow) function:
from datasette.utils import actor_matches_allow
actor_matches_allow({""id"": ""root""}, {""id"": ""*""})
# returns True
The currently authenticated actor is made available to plugins as request.actor .","[""Authentication and permissions""]",[]
authentication:authentication-actor,authentication,authentication-actor,Actors,"Through plugins, Datasette can support both authenticated users (with cookies) and authenticated API agents (via authentication tokens). The word ""actor"" is used to cover both of these cases.
Every request to Datasette has an associated actor value, available in the code as request.actor . This can be None for unauthenticated requests, or a JSON compatible Python dictionary for authenticated users or API agents.
The actor dictionary can be any shape - the design of that data structure is left up to the plugins. A useful convention is to include an ""id"" string, as demonstrated by the ""root"" actor below.
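A minimal sketch of a plugin that authenticates an actor, using the actor_from_request hook described next - the ?_bot=1 trigger here is purely illustrative:
from datasette import hookimpl


@hookimpl
def actor_from_request(datasette, request):
    # Illustrative only: treat any request with ?_bot=1 as a ""bot"" actor
    if request.args.get(""_bot""):
        return {""id"": ""bot""}
    # Returning None leaves the request unauthenticated
    return None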
Plugins can use the actor_from_request(datasette, request) hook to implement custom logic for authenticating an actor based on the incoming HTTP request.","[""Authentication and permissions""]",[]
authentication:authentication,authentication,authentication,Authentication and permissions,"Datasette does not require authentication by default. Any visitor to a Datasette instance can explore the full data and execute read-only SQL queries.
Datasette's plugin system can be used to add many different styles of authentication, such as user accounts, single sign-on or API keys.",[],[]
changelog:authentication,changelog,authentication,Authentication,"Prior to this release the Datasette ecosystem has treated authentication as exclusively the realm of plugins, most notably through datasette-auth-github .
0.44 introduces Authentication and permissions as core Datasette concepts ( #699 ). This enables different plugins to share responsibility for authenticating requests - you might have one plugin that handles user accounts and another one that allows automated access via API keys, for example.
You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new --root command-line option, which outputs a one-time use URL to authenticate as a root actor ( #784 ):
$ datasette fixtures.db --root
http://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477
INFO: Started server process [14973]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
Plugins can implement new ways of authenticating users using the new actor_from_request(datasette, request) hook.","[""Changelog"", ""0.44 (2020-06-11)""]","[{""href"": ""https://github.com/simonw/datasette-auth-github"", ""label"": ""datasette-auth-github""}, {""href"": ""https://github.com/simonw/datasette/issues/699"", ""label"": ""#699""}, {""href"": ""https://github.com/simonw/datasette/issues/784"", ""label"": ""#784""}]"
changelog:asgi,changelog,asgi,ASGI,"ASGI is the Asynchronous Server Gateway Interface standard. I've been wanting to convert Datasette into an ASGI application for over a year - Port Datasette to ASGI #272 tracks thirteen months of intermittent development - but with Datasette 0.29 the change is finally released. This also means Datasette now runs on top of Uvicorn and no longer depends on Sanic .
I wrote about the significance of this change in Porting Datasette to ASGI, and Turtles all the way down .
The most exciting consequence of this change is that Datasette plugins can now take advantage of the ASGI standard.","[""Changelog"", ""0.29 (2019-07-07)""]","[{""href"": ""https://asgi.readthedocs.io/"", ""label"": ""ASGI""}, {""href"": ""https://github.com/simonw/datasette/issues/272"", ""label"": ""Port Datasette to ASGI #272""}, {""href"": ""https://www.uvicorn.org/"", ""label"": ""Uvicorn""}, {""href"": ""https://github.com/huge-success/sanic"", ""label"": ""Sanic""}, {""href"": ""https://simonwillison.net/2019/Jun/23/datasette-asgi/"", ""label"": ""Porting Datasette to ASGI, and Turtles all the way down""}]"
deploying:apache-proxy-configuration,deploying,apache-proxy-configuration,Apache proxy configuration,"For Apache , you can use the ProxyPass directive. First make sure the following lines are uncommented:
LoadModule proxy_module lib/httpd/modules/mod_proxy.so
LoadModule proxy_http_module lib/httpd/modules/mod_proxy_http.so
Then add these directives to proxy traffic:
ProxyPass /my-datasette/ http://127.0.0.1:8009/my-datasette/
ProxyPreserveHost On
A live demo of Datasette running behind Apache using this proxy setup can be seen at datasette-apache-proxy-demo.datasette.io/prefix/ . The code for that demo can be found in the demos/apache-proxy directory.
Using --uds you can use Unix domain sockets similar to the nginx example:
ProxyPass /my-datasette/ unix:/tmp/datasette.sock|http://localhost/my-datasette/
The ProxyPreserveHost On directive ensures that the original Host: header from the incoming request is passed through to Datasette. Datasette needs this to correctly assemble links to other pages using the .absolute_url(request, path) method.","[""Deploying Datasette"", ""Running Datasette behind a proxy""]","[{""href"": ""https://httpd.apache.org/"", ""label"": ""Apache""}, {""href"": ""https://datasette-apache-proxy-demo.datasette.io/prefix/"", ""label"": ""datasette-apache-proxy-demo.datasette.io/prefix/""}, {""href"": ""https://github.com/simonw/datasette/tree/main/demos/apache-proxy"", ""label"": ""demos/apache-proxy""}, {""href"": ""https://httpd.apache.org/docs/2.4/mod/mod_proxy.html#proxypreservehost"", ""label"": ""ProxyPreserveHost On""}]"
authentication:allowdebugview,authentication,allowdebugview,The /-/allow-debug tool,"The /-/allow-debug tool lets you try out different ""allow"" blocks against different ""actor"" JSON objects. You can try that out here: https://latest.datasette.io/-/allow-debug","[""Authentication and permissions"", ""Permissions""]","[{""href"": ""https://latest.datasette.io/-/allow-debug"", ""label"": ""https://latest.datasette.io/-/allow-debug""}]"