id,page,ref,title,content,breadcrumbs,references internals:internals-tracer-trace-child-tasks,internals,internals-tracer-trace-child-tasks,Tracing child tasks,"If your code uses a mechanism such as asyncio.gather() to execute code in additional tasks you may find that some of the traces are missing from the display. You can use the trace_child_tasks() context manager to ensure these child tasks are correctly handled. import asyncio from datasette import tracer with tracer.trace_child_tasks(): results = await asyncio.gather( # ... async tasks here ) This example uses the register_routes() plugin hook to add a page at /parallel-queries which executes two SQL queries in parallel using asyncio.gather() and returns their results. import asyncio from datasette import hookimpl from datasette import tracer from datasette import Response @hookimpl def register_routes(): async def parallel_queries(datasette): db = datasette.get_database() with tracer.trace_child_tasks(): one, two = await asyncio.gather( db.execute(""select 1""), db.execute(""select 2""), ) return Response.json( { ""one"": one.single_value(), ""two"": two.single_value(), } ) return [ (r""/parallel-queries$"", parallel_queries), ] Adding ?_trace=1 will show that the trace covers both of those child tasks.","[""Internals for plugins"", ""datasette.tracer""]",[] full_text_search:configuring-fts-using-csvs-to-sqlite,full_text_search,configuring-fts-using-csvs-to-sqlite,Configuring FTS using csvs-to-sqlite,"If your data starts out in CSV files, you can use Datasette's companion tool csvs-to-sqlite to convert that file into a SQLite database and enable full-text search on specific columns. For a file called items.csv where you want full-text search to operate against the name and description columns you would run the following: $ csvs-to-sqlite items.csv items.db -f name -f description","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://github.com/simonw/csvs-to-sqlite"", ""label"": ""csvs-to-sqlite""}]" performance:http-caching,performance,http-caching,HTTP caching,"If your database is immutable and guaranteed not to change, you can gain major performance improvements from Datasette by enabling HTTP caching. This can work at two different levels. First, it can tell browsers to cache the results of queries and serve future requests from the browser cache. More significantly, it allows you to run Datasette behind a caching proxy such as Varnish or use a cache provided by a hosted service such as Fastly or Cloudflare . This can provide incredible speed-ups since a query only needs to be executed by Datasette the first time it is accessed - all subsequent hits can then be served by the cache. Using a caching proxy in this way could enable a Datasette-backed visualization to serve thousands of hits a second while running Datasette itself on extremely inexpensive hosting. Datasette's integration with HTTP caches can be enabled using a combination of configuration options and query string arguments. The default_cache_ttl setting sets the default HTTP cache TTL for all Datasette pages. This is 5 seconds unless you change it - you can set it to 0 if you wish to disable HTTP caching entirely. You can also change the cache timeout on a per-request basis using the ?_ttl=10 query string parameter. 
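As a quick illustration, here is a minimal sketch using httpx (the localhost:8001 URL and mydatabase name are assumptions for this example, not values Datasette provides):
import httpx

# With the default TTL, the response advertises the configured max-age
r = httpx.get(""http://localhost:8001/mydatabase.json"")
print(r.headers.get(""cache-control""))

# Adding ?_ttl=0 asks Datasette to disable HTTP caching for this request
r = httpx.get(""http://localhost:8001/mydatabase.json"", params={""_ttl"": 0})
print(r.headers.get(""cache-control""))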
This can be useful when you are working with the Datasette JSON API - you may decide that a specific query can be cached for a longer time, or maybe you need to set ?_ttl=0 for some requests, for example if you are running a SQL order by random() query.","[""Performance and caching""]","[{""href"": ""https://varnish-cache.org/"", ""label"": ""Varnish""}, {""href"": ""https://www.fastly.com/"", ""label"": ""Fastly""}, {""href"": ""https://www.cloudflare.com/"", ""label"": ""Cloudflare""}]" writing_plugins:writing-plugins-static-assets,writing_plugins,writing-plugins-static-assets,Static assets,"If your plugin has a static/ directory, Datasette will automatically configure itself to serve those static assets from the following path: /-/static-plugins/NAME_OF_PLUGIN_PACKAGE/yourfile.js Use the datasette.urls.static_plugins(plugin_name, path) method to generate URLs to that asset that take the base_url setting into account, see datasette.urls . To bundle the static assets for a plugin in the package that you publish to PyPI, add the following to the plugin's setup.py : package_data = ( { ""datasette_plugin_name"": [ ""static/plugin.js"", ], }, ) Where datasette_plugin_name is the name of the plugin package (note that it uses underscores, not hyphens) and static/plugin.js is the path within that package to the static file. datasette-cluster-map is a useful example of a plugin that includes packaged static assets in this way.","[""Writing plugins""]","[{""href"": ""https://github.com/simonw/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}]" writing_plugins:writing-plugins-custom-templates,writing_plugins,writing-plugins-custom-templates,Custom templates,"If your plugin has a templates/ directory, Datasette will attempt to load templates from that directory before it uses its own default templates. The priority order for template loading is: templates from the --template-dir argument, if specified templates from the templates/ directory in any installed plugins default templates that ship with Datasette See Custom pages and templates for more details on how to write custom templates, including which filenames to use to customize which parts of the Datasette UI. Templates should be bundled for distribution using the same package_data mechanism in setup.py described for static assets above, for example: package_data = ( { ""datasette_plugin_name"": [ ""templates/my_template.html"", ], }, ) You can also use wildcards here such as templates/*.html . See datasette-edit-schema for an example of this pattern.","[""Writing plugins""]","[{""href"": ""https://github.com/simonw/datasette-edit-schema"", ""label"": ""datasette-edit-schema""}]" testing_plugins:testing-plugins-pytest-httpx,testing_plugins,testing-plugins-pytest-httpx,Testing outbound HTTP calls with pytest-httpx,"If your plugin makes outbound HTTP calls - for example datasette-auth-github or datasette-import-table - you may need to mock those HTTP requests in your tests. The pytest-httpx package is a useful library for mocking calls. It can be tricky to use with Datasette though since it mocks all HTTPX requests, and Datasette's own testing mechanism uses HTTPX internally. To avoid breaking your tests, you can return [""localhost""] from the non_mocked_hosts() fixture. 
As an example, here's a very simple plugin which makes an outbound HTTP request and returns the resulting content: from datasette import hookimpl from datasette.utils.asgi import Response import httpx @hookimpl def register_routes(): return [ (r""^/-/fetch-url$"", fetch_url), ] async def fetch_url(datasette, request): if request.method == ""GET"": return Response.html( """"""
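<!-- minimal form, reconstructed from context: a hidden csrftoken plus a url field POSTed back to this view -->
<form action=""/-/fetch-url"" method=""post"">
    <input type=""hidden"" name=""csrftoken"" value=""{}"">
    <input name=""url"">
    <input type=""submit"">
</form>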
"""""".format( request.scope[""csrftoken""]() ) ) vars = await request.post_vars() url = vars[""url""] return Response.text(httpx.get(url).text) Here's a test for that plugin that mocks the HTTPX outbound request: from datasette.app import Datasette import pytest @pytest.fixture def non_mocked_hosts(): # This ensures httpx-mock will not affect Datasette's own # httpx calls made in the tests by datasette.client: return [""localhost""] async def test_outbound_http_call(httpx_mock): httpx_mock.add_response( url=""https://www.example.com/"", text=""Hello world"", ) datasette = Datasette([], memory=True) response = await datasette.client.post( ""/-/fetch-url"", data={""url"": ""https://www.example.com/""}, ) assert response.text == ""Hello world"" outbound_request = httpx_mock.get_request() assert ( outbound_request.url == ""https://www.example.com/"" )","[""Testing plugins""]","[{""href"": ""https://pypi.org/project/pytest-httpx/"", ""label"": ""pytest-httpx""}]" changelog:id42,changelog,id42,0.51.1 (2020-10-31),Improvements to the new Binary data documentation page.,"[""Changelog""]",[] internals:internals-response-asgi-send,internals,internals-response-asgi-send,Returning a response with .asgi_send(send),"In most cases you will return Response objects from your own view functions. You can also use a Response instance to respond at a lower level via ASGI, for example if you are writing code that uses the asgi_wrapper(datasette) hook. Create a Response object and then use await response.asgi_send(send) , passing the ASGI send function. For example: async def require_authorization(scope, receive, send): response = Response.text( ""401 Authorization Required"", headers={ ""www-authenticate"": 'Basic realm=""Datasette"", charset=""UTF-8""' }, status=401, ) await response.asgi_send(send)","[""Internals for plugins"", ""Response class""]",[] changelog:id17,changelog,id17,0.61 (2022-03-23),"In preparation for Datasette 1.0, this release includes two potentially backwards-incompatible changes. Hashed URL mode has been moved to a separate plugin, and the way Datasette generates URLs to databases and tables with special characters in their name such as / and . has changed. Datasette also now requires Python 3.7 or higher. URLs within Datasette now use a different encoding scheme for tables or databases that include ""special"" characters outside of the range of a-zA-Z0-9_- . This scheme is explained here: Tilde encoding . ( #1657 ) Removed hashed URL mode from Datasette. The new datasette-hashed-urls plugin can be used to achieve the same result, see datasette-hashed-urls for details. ( #1661 ) Databases can now have a custom path within the Datasette instance that is independent of the database name, using the db.route property. ( #1668 ) Datasette is now covered by a Code of Conduct . ( #1654 ) Python 3.6 is no longer supported. ( #1577 ) Tests now run against Python 3.11-dev. ( #1621 ) New datasette.ensure_permissions(actor, permissions) internal method for checking multiple permissions at once. ( #1675 ) New datasette.check_visibility(actor, action, resource=None) internal method for checking if a user can see a resource that would otherwise be invisible to unauthenticated users. ( #1678 ) Table and row HTML pages now include a element and return a Link: URL; rel=""alternate""; type=""application/json+datasette"" HTTP header pointing to the JSON version of those pages. ( #1533 ) Access-Control-Expose-Headers: Link is now added to the CORS headers, allowing remote JavaScript to access that header. 
Canned queries are now shown at the top of the database page, directly below the SQL editor. Previously they were shown at the bottom, below the list of tables. ( #1612 ) Datasette now has a default favicon. ( #1603 ) sqlite_stat tables are now hidden by default. ( #1587 ) SpatiaLite tables data_licenses , KNN and KNN2 are now hidden by default. ( #1601 ) SQL query tracing mechanism now works for queries executed in asyncio sub-tasks, such as those created by asyncio.gather() . ( #1576 ) datasette.tracer mechanism is now documented. Common Datasette symbols can now be imported directly from the top-level datasette package, see Import shortcuts . Those symbols are Response , Forbidden , NotFound , hookimpl , actor_matches_allow . ( #957 ) /-/versions page now returns additional details for libraries used by SpatiaLite. ( #1607 ) Documentation now links to the Datasette Tutorials . Datasette will now also look for SpatiaLite in /opt/homebrew - thanks, Dan Peterson. ( #1649 ) Fixed bug where custom pages did not work on Windows. Thanks, Robert Christie. ( #1545 ) Fixed error caused when a table had a column named n . ( #1228 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/1657"", ""label"": ""#1657""}, {""href"": ""https://github.com/simonw/datasette/issues/1661"", ""label"": ""#1661""}, {""href"": ""https://github.com/simonw/datasette/issues/1668"", ""label"": ""#1668""}, {""href"": ""https://github.com/simonw/datasette/blob/main/CODE_OF_CONDUCT.md"", ""label"": ""Code of Conduct""}, {""href"": ""https://github.com/simonw/datasette/issues/1654"", ""label"": ""#1654""}, {""href"": ""https://github.com/simonw/datasette/issues/1577"", ""label"": ""#1577""}, {""href"": ""https://github.com/simonw/datasette/issues/1621"", ""label"": ""#1621""}, {""href"": ""https://github.com/simonw/datasette/issues/1675"", ""label"": ""#1675""}, {""href"": ""https://github.com/simonw/datasette/issues/1678"", ""label"": ""#1678""}, {""href"": ""https://github.com/simonw/datasette/issues/1533"", ""label"": ""#1533""}, {""href"": ""https://github.com/simonw/datasette/issues/1612"", ""label"": ""#1612""}, {""href"": ""https://github.com/simonw/datasette/issues/1603"", ""label"": ""#1603""}, {""href"": ""https://github.com/simonw/datasette/issues/1587"", ""label"": ""#1587""}, {""href"": ""https://github.com/simonw/datasette/issues/1601"", ""label"": ""#1601""}, {""href"": ""https://github.com/simonw/datasette/issues/1576"", ""label"": ""#1576""}, {""href"": ""https://github.com/simonw/datasette/issues/957"", ""label"": ""#957""}, {""href"": ""https://github.com/simonw/datasette/issues/1607"", ""label"": ""#1607""}, {""href"": ""https://datasette.io/tutorials"", ""label"": ""Datasette Tutorials""}, {""href"": ""https://github.com/simonw/datasette/pull/1649"", ""label"": ""#1649""}, {""href"": ""https://github.com/simonw/datasette/issues/1545"", ""label"": ""#1545""}, {""href"": ""https://github.com/simonw/datasette/issues/1228"", ""label"": ""#1228""}]" settings:setting-truncate-cells-html,settings,setting-truncate-cells-html,truncate_cells_html,"In the HTML table view, truncate any strings that are longer than this value. The full value will still be available in CSV, JSON and on the individual row HTML page. Set this to 0 to disable truncation. datasette mydatabase.db --setting truncate_cells_html 0","[""Settings"", ""Settings""]",[] cli-reference:cli-help-install-help,cli-reference,cli-help-install-help,datasette install,"Install new Datasette plugins. 
This command works like pip install but ensures that your plugins will be installed into the same environment as Datasette. This command: datasette install datasette-cluster-map Would install the datasette-cluster-map plugin. [[[cog help([""install"", ""--help""]) ]]] Usage: datasette install [OPTIONS] PACKAGES... Install plugins and packages from PyPI into the same environment as Datasette Options: -U, --upgrade Upgrade packages to latest version --help Show this message and exit. [[[end]]]","[""CLI reference""]","[{""href"": ""https://datasette.io/plugins/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}]" internals:internals-database,internals,internals-database,Database class,"Instances of the Database class can be used to execute queries against attached SQLite databases, and to run introspection against their schemas.","[""Internals for plugins""]",[] changelog:javascript-modules,changelog,javascript-modules,JavaScript modules,"JavaScript modules were introduced in ECMAScript 2015 and provide native browser support for the import and export keywords. To use modules, JavaScript needs to be included in <script> tags with a type=""module"" attribute. You can also specify a SRI (subresource integrity hash) for these assets: { ""extra_css_urls"": [ { ""url"": ""https://simonwillison.net/static/css/all.bf8cd891642c.css"", ""sri"": ""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"" } ], ""extra_js_urls"": [ { ""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"", ""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="" } ] } This will produce: <link rel=""stylesheet"" href=""https://simonwillison.net/static/css/all.bf8cd891642c.css"" integrity=""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"" crossorigin=""anonymous""> <script src=""https://code.jquery.com/jquery-3.2.1.slim.min.js"" integrity=""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="" crossorigin=""anonymous""></script> Modern browsers will only execute the stylesheet or JavaScript if the SRI hash matches the content served. You can generate hashes using www.srihash.org Items in ""extra_js_urls"" can specify ""module"": true if they reference JavaScript that uses JavaScript modules . This configuration: { ""extra_js_urls"": [ { ""url"": ""https://example.datasette.io/module.js"", ""module"": true } ] } Will produce this HTML: <script type=""module"" src=""https://example.datasette.io/module.js""></script>","[""Custom pages and templates""]","[{""href"": ""https://www.srihash.org/"", ""label"": ""www.srihash.org""}, {""href"": ""https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules"", ""label"": ""JavaScript modules""}]" changelog:smaller-changes,changelog,smaller-changes,Smaller changes,"Wide tables shown within Datasette now scroll horizontally ( #998 ). This is achieved using a new <div class=""table-wrapper"">
element which may impact the implementation of some plugins (for example this change to datasette-cluster-map ). New debug-menu permission. ( #1068 ) Removed --debug option, which didn't do anything. ( #814 ) Link: HTTP header pagination. ( #1014 ) x button for clearing filters. ( #1016 ) Edit SQL button on canned queries, ( #1019 ) --load-extension=spatialite shortcut. ( #1028 ) scale-in animation for column action menu. ( #1039 ) Option to pass a list of templates to .render_template() is now documented. ( #1045 ) New datasette.urls.static_plugins() method. ( #1033 ) datasette -o option now opens the most relevant page. ( #976 ) datasette --cors option now enables access to /database.db downloads. ( #1057 ) Database file downloads now implement cascading permissions, so you can download a database if you have view-database-download permission even if you do not have permission to access the Datasette instance. ( #1058 ) New documentation on Designing URLs for your plugin . ( #1053 )","[""Changelog"", ""0.51 (2020-10-31)""]","[{""href"": ""https://github.com/simonw/datasette/issues/998"", ""label"": ""#998""}, {""href"": ""https://github.com/simonw/datasette-cluster-map/commit/fcb4abbe7df9071c5ab57defd39147de7145b34e"", ""label"": ""this change to datasette-cluster-map""}, {""href"": ""https://github.com/simonw/datasette/issues/1068"", ""label"": ""#1068""}, {""href"": ""https://github.com/simonw/datasette/issues/814"", ""label"": ""#814""}, {""href"": ""https://github.com/simonw/datasette/issues/1014"", ""label"": ""#1014""}, {""href"": ""https://github.com/simonw/datasette/issues/1016"", ""label"": ""#1016""}, {""href"": ""https://github.com/simonw/datasette/issues/1019"", ""label"": ""#1019""}, {""href"": ""https://github.com/simonw/datasette/issues/1028"", ""label"": ""#1028""}, {""href"": ""https://github.com/simonw/datasette/issues/1039"", ""label"": ""#1039""}, {""href"": ""https://github.com/simonw/datasette/issues/1045"", ""label"": ""#1045""}, {""href"": ""https://github.com/simonw/datasette/issues/1033"", ""label"": ""#1033""}, {""href"": ""https://github.com/simonw/datasette/issues/976"", ""label"": ""#976""}, {""href"": ""https://github.com/simonw/datasette/issues/1057"", ""label"": ""#1057""}, {""href"": ""https://github.com/simonw/datasette/issues/1058"", ""label"": ""#1058""}, {""href"": ""https://github.com/simonw/datasette/issues/1053"", ""label"": ""#1053""}]" sql_queries:canned-queries-json-api,sql_queries,canned-queries-json-api,JSON API for writable canned queries,"Writable canned queries can also be accessed using a JSON API. You can POST data to them using JSON, and you can request that their response is returned to you as JSON. To submit JSON to a writable canned query, encode key/value parameters as a JSON document: POST /mydatabase/add_message {""message"": ""Message goes here""} You can also continue to submit data using regular form encoding, like so: POST /mydatabase/add_message message=Message+goes+here There are three options for specifying that you would like the response to your request to return JSON data, as opposed to an HTTP redirect to another page. 
Set an Accept: application/json header on your request Include ?_json=1 in the URL that you POST to Include ""_json"": 1 in your JSON body, or &_json=1 in your form encoded body The JSON response will look like this: { ""ok"": true, ""message"": ""Query executed, 1 row affected"", ""redirect"": ""/data/add_name"" } The ""message"" and ""redirect"" values here will take into account on_success_message , on_success_redirect , on_error_message and on_error_redirect , if they have been set.","[""Running SQL queries"", ""Canned queries""]",[] changelog:flash-messages,changelog,flash-messages,Flash messages,"Writable canned queries needed a mechanism to let the user know that the query has been successfully executed. The new flash messaging system ( #790 ) allows messages to persist in signed cookies which are then displayed to the user on the next page that they visit. Plugins can use this mechanism to display their own messages, see .add_message(request, message, type=datasette.INFO) for details. You can try out the new messages using the /-/messages debug tool, for example at https://latest.datasette.io/-/messages","[""Changelog"", ""0.44 (2020-06-11)""]","[{""href"": ""https://github.com/simonw/datasette/issues/790"", ""label"": ""#790""}, {""href"": ""https://latest.datasette.io/-/messages"", ""label"": ""https://latest.datasette.io/-/messages""}]" custom_templates:id1,custom_templates,id1,Custom pages,"You can add templated pages to your Datasette instance by creating HTML files in a pages directory within your templates directory. For example, to add a custom page that is served at http://localhost/about you would create a file in templates/pages/about.html , then start Datasette like this: $ datasette mydb.db --template-dir=templates/ You can nest directories within pages to create a nested structure. To create a http://localhost:8001/about/map page you would create templates/pages/about/map.html .","[""Custom pages and templates""]",[] plugins:one-off-plugins-using-plugins-dir,plugins,one-off-plugins-using-plugins-dir,One-off plugins using --plugins-dir,"You can also define one-off per-project plugins by saving them as plugin_name.py functions in a plugins/ folder and then passing that folder to datasette using the --plugins-dir option: datasette mydb.db --plugins-dir=plugins/","[""Plugins"", ""Installing plugins""]",[] custom_templates:custom-pages-parameters,custom_templates,custom-pages-parameters,Path parameters for pages,"You can define custom pages that match multiple paths by creating files with {variable} definitions in their filenames. For example, to capture any request to a URL matching /about/* , you would create a template in the following location: templates/pages/about/{slug}.html A hit to /about/news would render that template and pass in a variable called slug with a value of ""news"" . If you use this mechanism don't forget to return a 404 if the referenced content could not be found. You can do this using {{ raise_404() }} described below. 
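For example, a minimal sketch of a templates/pages/about/{slug}.html template (the allowed slug values here are hypothetical):
{% if slug not in (""news"", ""team"") %}
    {{ raise_404(""Page not found"") }}
{% endif %}
<h1>About: {{ slug }}</h1>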
Templates defined using custom page routes work particularly well with the sql() template function from datasette-template-sql or the graphql() template function from datasette-graphql .","[""Custom pages and templates"", ""Custom pages""]","[{""href"": ""https://github.com/simonw/datasette-template-sql"", ""label"": ""datasette-template-sql""}, {""href"": ""https://github.com/simonw/datasette-graphql#the-graphql-template-function"", ""label"": ""datasette-graphql""}]" json_api:column-filter-arguments,json_api,column-filter-arguments,Column filter arguments,"You can filter the data returned by the table based on column values using a query string argument. ?column__exact=value or ?_column=value Returns rows where the specified column exactly matches the value. ?column__not=value Returns rows where the column does not match the value. ?column__contains=value Rows where the string column contains the specified value ( column like ""%value%"" in SQL). ?column__endswith=value Rows where the string column ends with the specified value ( column like ""%value"" in SQL). ?column__startswith=value Rows where the string column starts with the specified value ( column like ""value%"" in SQL). ?column__gt=value Rows which are greater than the specified value. ?column__gte=value Rows which are greater than or equal to the specified value. ?column__lt=value Rows which are less than the specified value. ?column__lte=value Rows which are less than or equal to the specified value. ?column__like=value Match rows with a LIKE clause, case insensitive and with % as the wildcard character. ?column__notlike=value Match rows that do not match the provided LIKE clause. ?column__glob=value Similar to LIKE but uses Unix wildcard syntax and is case sensitive. ?column__in=value1,value2,value3 Rows where column matches any of the provided values. You can use a comma separated string, or you can use a JSON array. The JSON array option is useful if one of your matching values itself contains a comma: ?column__in=[""value"",""value,with,commas""] ?column__notin=value1,value2,value3 Rows where column does not match any of the provided values. The inverse of __in= . Also supports JSON arrays. ?column__arraycontains=value Works against columns that contain JSON arrays - matches if any of the values in that array match the provided value. This is only available if the json1 SQLite extension is enabled. ?column__arraynotcontains=value Works against columns that contain JSON arrays - matches if none of the values in that array match the provided value. This is only available if the json1 SQLite extension is enabled. ?column__date=value Column is a datestamp occurring on the specified YYYY-MM-DD date, e.g. 2018-01-02 . ?column__isnull=1 Matches rows where the column is null. ?column__notnull=1 Matches rows where the column is not null. ?column__isblank=1 Matches rows where the column is blank, meaning null or the empty string. 
?column__notblank=1 Matches rows where the column is not blank.","[""JSON API"", ""Table arguments""]",[] metadata:metadata-hiding-tables,metadata,metadata-hiding-tables,Hiding tables,"You can hide tables from the database listing view (in the same way that FTS and SpatiaLite tables are automatically hidden) using ""hidden"": true : { ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""hidden"": true } } } } }","[""Metadata""]",[] metadata:metadata-column-descriptions,metadata,metadata-column-descriptions,Column descriptions,"You can include descriptions for your columns by adding a ""columns"": {""name-of-column"": ""description-of-column""} block to your table metadata: { ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""columns"": { ""column1"": ""Description of column 1"", ""column2"": ""Description of column 2"" } } } } } } These will be displayed at the top of the table page, and will also show in the cog menu for each column. You can see an example of how these look at latest.datasette.io/fixtures/roadside_attractions .","[""Metadata""]","[{""href"": ""https://latest.datasette.io/fixtures/roadside_attractions"", ""label"": ""latest.datasette.io/fixtures/roadside_attractions""}]" full_text_search:full-text-search-custom-sql,full_text_search,full-text-search-custom-sql,Searches using custom SQL,"You can include full-text search results in custom SQL queries. The general pattern with SQLite search is to run the search as a sub-select that returns rowid values, then include those rowids in another part of the query. You can see the syntax for a basic search by running that search on a table page and then clicking ""View and edit SQL"" to see the underlying SQL. For example, consider this search for manafort in the US FARA database : /fara/FARA_All_ShortForms?_search=manafort If you click View and edit SQL you'll see that the underlying SQL looks like this: select rowid, Short_Form_Termination_Date, Short_Form_Date, Short_Form_Last_Name, Short_Form_First_Name, Registration_Number, Registration_Date, Registrant_Name, Address_1, Address_2, City, State, Zip from FARA_All_ShortForms where rowid in ( select rowid from FARA_All_ShortForms_fts where FARA_All_ShortForms_fts match escape_fts(:search) ) order by rowid limit 101","[""Full-text search""]","[{""href"": ""https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort"", ""label"": ""manafort in the US FARA database""}, {""href"": ""https://fara.datasettes.com/fara?sql=select%0D%0A++rowid%2C%0D%0A++Short_Form_Termination_Date%2C%0D%0A++Short_Form_Date%2C%0D%0A++Short_Form_Last_Name%2C%0D%0A++Short_Form_First_Name%2C%0D%0A++Registration_Number%2C%0D%0A++Registration_Date%2C%0D%0A++Registrant_Name%2C%0D%0A++Address_1%2C%0D%0A++Address_2%2C%0D%0A++City%2C%0D%0A++State%2C%0D%0A++Zip%0D%0Afrom%0D%0A++FARA_All_ShortForms%0D%0Awhere%0D%0A++rowid+in+%28%0D%0A++++select%0D%0A++++++rowid%0D%0A++++from%0D%0A++++++FARA_All_ShortForms_fts%0D%0A++++where%0D%0A++++++FARA_All_ShortForms_fts+match+escape_fts%28%3Asearch%29%0D%0A++%29%0D%0Aorder+by%0D%0A++rowid%0D%0Alimit%0D%0A++101&search=manafort"", ""label"": ""View and edit SQL""}]" installation:installing-plugins-using-pipx,installation,installing-plugins-using-pipx,Installing plugins using pipx,"You can install additional datasette plugins with pipx inject like so: $ pipx inject datasette datasette-json-html injected package datasette-json-html into venv datasette done! 
✨ 🌟 ✨ $ datasette plugins [ { ""name"": ""datasette-json-html"", ""static"": false, ""templates"": false, ""version"": ""0.6"" } ]","[""Installation"", ""Advanced installation options"", ""Using pipx""]",[] authentication:authentication-permissions-metadata,authentication,authentication-permissions-metadata,Configuring permissions in metadata.json,"You can limit who is allowed to view different parts of your Datasette instance using ""allow"" keys in your Metadata configuration. You can control the following: Access to the entire Datasette instance Access to specific databases Access to specific tables and views Access to specific Canned queries If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries.","[""Authentication and permissions""]",[] changelog:id61,changelog,id61,0.41 (2020-05-06),"You can now create custom pages within your Datasette instance using a custom template file. For example, adding a template file called templates/pages/about.html will result in a new page being served at /about on your instance. See the custom pages documentation for full details, including how to return custom HTTP headers, redirects and status codes. ( #648 ) Configuration directory mode ( #731 ) allows you to define a custom Datasette instance as a directory. So instead of running the following: $ datasette one.db two.db \ --metadata=metadata.json \ --template-dir=templates/ \ --plugins-dir=plugins \ --static css:css You can instead arrange your files in a single directory called my-project and run this: $ datasette my-project/ Also in this release: New NOT LIKE table filter: ?colname__notlike=expression . ( #750 ) Datasette now has a pattern portfolio at /-/patterns - e.g. https://latest.datasette.io/-/patterns . This is a page that shows every Datasette user interface component in one place, to aid core development and people building custom CSS themes. ( #151 ) SQLite PRAGMA functions such as pragma_table_info(tablename) are now allowed in Datasette SQL queries. ( #761 ) Datasette pages now consistently return a content-type of text/html; charset=utf-8 . ( #752 ) Datasette now handles an ASGI raw_path value of None , which should allow compatibility with the Mangum adapter for running ASGI apps on AWS Lambda. Thanks, Colin Dellow. ( #719 ) Installation documentation now covers Using pipx . ( #756 ) Improved the documentation for Full-text search . 
( #748 )","[{""href"": ""https://github.com/simonw/datasette/issues/648"", ""label"": ""#648""}, {""href"": ""https://github.com/simonw/datasette/issues/731"", ""label"": ""#731""}, {""href"": ""https://github.com/simonw/datasette/issues/750"", ""label"": ""#750""}, {""href"": ""https://latest.datasette.io/-/patterns"", ""label"": ""https://latest.datasette.io/-/patterns""}, {""href"": ""https://github.com/simonw/datasette/issues/151"", ""label"": ""#151""}, {""href"": ""https://www.sqlite.org/pragma.html#pragfunc"", ""label"": ""PRAGMA functions""}, {""href"": ""https://github.com/simonw/datasette/issues/761"", ""label"": ""#761""}, {""href"": ""https://github.com/simonw/datasette/issues/752"", ""label"": ""#752""}, {""href"": ""https://github.com/erm/mangum"", ""label"": ""Mangum""}, {""href"": ""https://github.com/simonw/datasette/pull/719"", ""label"": ""#719""}, {""href"": ""https://github.com/simonw/datasette/issues/756"", ""label"": ""#756""}, {""href"": ""https://github.com/simonw/datasette/issues/748"", ""label"": ""#748""}]" changelog:control-http-caching-with-ttl,changelog,control-http-caching-with-ttl,Control HTTP caching with ?_ttl=,"You can now customize the HTTP max-age header that is sent on a per-URL basis, using the new ?_ttl= query string parameter. You can set this to any value in seconds, or you can set it to 0 to disable HTTP caching entirely. Consider for example this query which returns a randomly selected member of the Avengers: select * from [avengers/avengers] order by random() limit 1 If you hit the following page repeatedly you will get the same result, due to HTTP caching: /fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1 By adding ?_ttl=0 to the URL you can ensure the page will not be cached and get back a different super hero every time: /fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1"", ""label"": ""/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1""}, {""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0"", ""label"": ""/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0""}]" writing_plugins:writing-plugins-designing-urls,writing_plugins,writing-plugins-designing-urls,Designing URLs for your plugin,"You can register new URL routes within Datasette using the register_routes(datasette) plugin hook. Datasette's default URLs include these: /dbname - database page /dbname/tablename - table page /dbname/tablename/pk - row page See Pages and API endpoints and Introspection for more default URL routes. To avoid accidentally conflicting with a database file that may be loaded into Datasette, plugins should register URLs using a /-/ prefix. For example, if your plugin adds a new interface for uploading Excel files you might register a URL route like this one: /-/upload-excel Try to avoid registering URLs that clash with other plugins that your users might have installed. There is no central repository of reserved URL paths (yet) but you can review existing plugins by browsing the plugins directory . 
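A minimal sketch of registering such a route (the upload-excel path and view are the hypothetical example from above, not a real plugin):
from datasette import hookimpl
from datasette import Response


async def upload_excel(request):
    # Placeholder page for the hypothetical Excel upload interface
    return Response.html(""<h1>Upload an Excel file</h1>"")


@hookimpl
def register_routes():
    # The /-/ prefix avoids clashing with any attached database names
    return [(r""^/-/upload-excel$"", upload_excel)]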
If your plugin includes functionality that relates to a specific database you could also register a URL route like this: /dbname/-/upload-excel Or for a specific table like this: /dbname/tablename/-/modify-table-schema Note that a row could have a primary key of - and this URL scheme will still work, because Datasette row pages do not ever have a trailing slash followed by additional path components.","[""Writing plugins""]",[] deploying:deploying-systemd,deploying,deploying-systemd,Running Datasette using systemd,"You can run Datasette on Ubuntu or Debian systems using systemd . First, ensure you have Python 3 and pip installed. On Ubuntu you can use sudo apt-get install python3 python3-pip . You can install Datasette into a virtual environment, or you can install it system-wide. To install system-wide, use sudo pip3 install datasette . Now create a folder for your Datasette databases, for example using mkdir /home/ubuntu/datasette-root . You can copy a test database into that folder like so: cd /home/ubuntu/datasette-root curl -O https://latest.datasette.io/fixtures.db Create a file at /etc/systemd/system/datasette.service with the following contents: [Unit] Description=Datasette After=network.target [Service] Type=simple User=ubuntu Environment=DATASETTE_SECRET= WorkingDirectory=/home/ubuntu/datasette-root ExecStart=datasette serve . -h 127.0.0.1 -p 8000 Restart=on-failure [Install] WantedBy=multi-user.target Add a random value for the DATASETTE_SECRET - this will be used to sign Datasette cookies such as the CSRF token cookie. You can generate a suitable value like so: $ python3 -c 'import secrets; print(secrets.token_hex(32))' This configuration will run Datasette against all database files contained in the /home/ubuntu/datasette-root directory. If that directory contains a metadata.yml (or .json ) file or a templates/ or plugins/ sub-directory those will automatically be loaded by Datasette - see Configuration directory mode for details. You can start the Datasette process running using the following: sudo systemctl daemon-reload sudo systemctl start datasette.service You will need to restart the Datasette service after making changes to its metadata.json configuration or adding a new database file to that directory. You can do that using: sudo systemctl restart datasette.service Once the service has started you can confirm that Datasette is running on port 8000 like so: curl 127.0.0.1:8000/-/versions.json # Should output JSON showing the installed version Datasette will not be accessible from outside the server because it is listening on 127.0.0.1 . 
You can expose it by instead listening on 0.0.0.0 , but a better way is to set up a proxy such as nginx - see Running Datasette behind a proxy .","[""Deploying Datasette""]",[] plugins:plugins-installed,plugins,plugins-installed,Seeing what plugins are installed,"You can see a list of installed plugins by navigating to the /-/plugins page of your Datasette instance - for example: https://fivethirtyeight.datasettes.com/-/plugins You can also use the datasette plugins command: $ datasette plugins [ { ""name"": ""datasette_json_html"", ""static"": false, ""templates"": false, ""version"": ""0.4.0"" } ] [[[cog from datasette import cli from click.testing import CliRunner import textwrap, json cog.out(""\n"") result = CliRunner().invoke(cli.cli, [""plugins"", ""--all""]) # cog.out() with text containing newlines was unindenting for some reason cog.outl(""If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette::\n"") plugins = [p for p in json.loads(result.output) if p[""name""].startswith(""datasette."")] indented = textwrap.indent(json.dumps(plugins, indent=4), "" "") for line in indented.split(""\n""): cog.outl(line) cog.out(""\n\n"") ]]] If you run datasette plugins --all it will include default plugins that ship as part of Datasette: [ { ""name"": ""datasette.actor_auth_cookie"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""actor_from_request"" ] }, { ""name"": ""datasette.blob_renderer"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""register_output_renderer"" ] }, { ""name"": ""datasette.default_magic_parameters"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""register_magic_parameters"" ] }, { ""name"": ""datasette.default_menu_links"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""menu_links"" ] }, { ""name"": ""datasette.default_permissions"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""permission_allowed"" ] }, { ""name"": ""datasette.facets"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""register_facet_classes"" ] }, { ""name"": ""datasette.filters"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""filters_from_request"" ] }, { ""name"": ""datasette.forbidden"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""forbidden"" ] }, { ""name"": ""datasette.handle_exception"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""handle_exception"" ] }, { ""name"": ""datasette.publish.cloudrun"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""publish_subcommand"" ] }, { ""name"": ""datasette.publish.heroku"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""publish_subcommand"" ] }, { ""name"": ""datasette.sql_functions"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""prepare_connection"" ] } ] [[[end]]] You can add the --plugins-dir= option to include any plugins found in that directory.","[""Plugins""]","[{""href"": ""https://fivethirtyeight.datasettes.com/-/plugins"", ""label"": ""https://fivethirtyeight.datasettes.com/-/plugins""}]" facets:facets-metadata,facets,facets-metadata,Facets in metadata.json,"You can turn facets on by default for specific tables by adding them to a ""facets"" key in a Datasette Metadata file. 
Here's an example that turns on faceting by default for the qLegalStatus column in the Street_Tree_List table in the sf-trees database: { ""databases"": { ""sf-trees"": { ""tables"": { ""Street_Tree_List"": { ""facets"": [""qLegalStatus""] } } } } } Facets defined in this way will always be shown in the interface and returned in the API, regardless of the _facet arguments passed to the view. You can specify array or date facets in metadata using JSON objects with a single key of array or date and a value specifying the column, like this: { ""facets"": [ {""array"": ""tags""}, {""date"": ""created""} ] } You can change the default facet size (the number of results shown for each facet) for a table using facet_size : { ""databases"": { ""sf-trees"": { ""tables"": { ""Street_Tree_List"": { ""facets"": [""qLegalStatus""], ""facet_size"": 10 } } } } }","[""Facets""]",[] installation:upgrading-packages-using-pipx,installation,upgrading-packages-using-pipx,Upgrading packages using pipx,"You can upgrade your pipx installation to the latest release of Datasette using pipx upgrade datasette : $ pipx upgrade datasette upgraded package datasette from 0.39 to 0.40 (location: /Users/simon/.local/pipx/venvs/datasette) To upgrade a plugin within the pipx environment use pipx runpip datasette install -U name-of-plugin - like this: % datasette plugins [ { ""name"": ""datasette-vega"", ""static"": true, ""templates"": false, ""version"": ""0.6"" } ] $ pipx runpip datasette install -U datasette-vega Collecting datasette-vega Downloading datasette_vega-0.6.2-py3-none-any.whl (1.8 MB) |████████████████████████████████| 1.8 MB 2.0 MB/s ... Installing collected packages: datasette-vega Attempting uninstall: datasette-vega Found existing installation: datasette-vega 0.6 Uninstalling datasette-vega-0.6: Successfully uninstalled datasette-vega-0.6 Successfully installed datasette-vega-0.6.2 $ datasette plugins [ { ""name"": ""datasette-vega"", ""static"": true, ""templates"": false, ""version"": ""0.6.2"" } ]","[""Installation"", ""Advanced installation options"", ""Using pipx""]",[] custom_templates:custom-pages-redirects,custom_templates,custom-pages-redirects,Custom redirects,"You can use the custom_redirect(location) function to redirect users to another page, for example in a file called pages/datasette.html : {{ custom_redirect(""https://github.com/simonw/datasette"") }} Now requests to http://localhost:8001/datasette will result in a redirect. These redirects are served with a 302 Found status code by default. You can send a 301 Moved Permanently code by passing 301 as the second argument to the function: {{ custom_redirect(""https://github.com/simonw/datasette"", 301) }}","[""Custom pages and templates"", ""Custom pages""]",[] writing_plugins:id1,writing_plugins,id1,Writing plugins,"You can write one-off plugins that apply to just one Datasette instance, or you can write plugins which can be installed using pip and can be shipped to the Python Package Index ( PyPI ) for other people to install. Want to start by looking at an example? The Datasette plugins directory lists more than 90 open source plugins with code you can explore. 
The plugin hooks page includes links to example plugins for each of the documented hooks.",[],"[{""href"": ""https://pypi.org/"", ""label"": ""PyPI""}, {""href"": ""https://datasette.io/plugins"", ""label"": ""Datasette plugins directory""}]" deploying:deploying-proxy,deploying,deploying-proxy,Running Datasette behind a proxy,"You may wish to run Datasette behind an Apache or nginx proxy, using a path within your existing site. You can use the base_url configuration setting to tell Datasette to serve traffic with a specific URL prefix. For example, you could run Datasette like this: datasette my-database.db --setting base_url /my-datasette/ -p 8009 This will run Datasette with the following URLs: http://127.0.0.1:8009/my-datasette/ - the Datasette homepage http://127.0.0.1:8009/my-datasette/my-database - the page for the my-database.db database http://127.0.0.1:8009/my-datasette/my-database/some_table - the page for the some_table table You can now set your nginx or Apache server to proxy the /my-datasette/ path to this Datasette instance.","[""Deploying Datasette""]",[] changelog:id69,changelog,id69,0.34 (2020-01-29),"_search= queries are now correctly escaped using a new escape_fts() custom SQL function. This means you can now run searches for strings like park. without seeing errors. ( #651 ) Google Cloud Run is no longer in beta, so datasette publish cloudrun has been updated to work even if the user has not installed the gcloud beta components package. Thanks, Katie McLaughlin ( #660 ) datasette package now accepts a --port option for specifying which port the resulting Docker container should listen on. ( #661 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/651"", ""label"": ""#651""}, {""href"": ""https://cloud.google.com/run/"", ""label"": ""Google Cloud Run""}, {""href"": ""https://github.com/simonw/datasette/pull/660"", ""label"": ""#660""}, {""href"": ""https://github.com/simonw/datasette/issues/661"", ""label"": ""#661""}]" internals:datasette-permission-allowed,internals,datasette-permission-allowed,"await .permission_allowed(actor, action, resource=None, default=False)","actor - dictionary The authenticated actor. This is usually request.actor . action - string The name of the action that is being permission checked. resource - string or tuple, optional The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource. default - optional, True or False Should this permission check be default allow or default deny. Check if the given actor has permission to perform the given action on the given resource. Some permission checks are carried out against rules defined in metadata.json , while other custom permissions may be decided by plugins that implement the permission_allowed(datasette, actor, action, resource) plugin hook. If neither metadata.json nor any of the plugins provide an answer to the permission query the default argument will be returned. See Built-in permissions for a full list of permission actions included in Datasette core.","[""Internals for plugins"", ""Datasette class""]",[] internals:datasette-check-visibility,internals,datasette-check-visibility,"await .check_visibility(actor, action=None, resource=None, permissions=None)","actor - dictionary The authenticated actor. This is usually request.actor . action - string, optional The name of the action that is being permission checked. 
resource - string or tuple, optional The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource. permissions - list of action strings or (action, resource) tuples, optional Provide this instead of action and resource to check multiple permissions at once. This convenience method can be used to answer the question ""should this item be considered private, in that it is visible to me but it is not visible to anonymous users?"" It returns a tuple of two booleans, (visible, private) . visible indicates if the actor can see this resource. private will be True if an anonymous user would not be able to view the resource. This example checks if the user can access a specific table, and sets private so that a padlock icon can later be displayed: visible, private = await self.ds.check_visibility( request.actor, action=""view-table"", resource=(database, table), ) The following example runs three checks in a row, similar to await .ensure_permissions(actor, permissions) . If any of the checks are denied before one of them is explicitly granted then visible will be False . private will be True if an anonymous user would not be able to view the resource. visible, private = await self.ds.check_visibility( request.actor, permissions=[ (""view-table"", (database, table)), (""view-database"", database), ""view-instance"", ], )","[""Internals for plugins"", ""Datasette class""]",[] internals:datasette-ensure-permissions,internals,datasette-ensure-permissions,"await .ensure_permissions(actor, permissions)","actor - dictionary The authenticated actor. This is usually request.actor . permissions - list A list of permissions to check. Each permission in that list can be a string action name or a 2-tuple of (action, resource) . This method allows multiple permissions to be checked at once. It raises a datasette.Forbidden exception if any of the checks are denied before one of them is explicitly granted. This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if any one of the following checks returns True , or if none of them returns False : await self.ds.ensure_permissions( request.actor, [ (""view-table"", (database, table)), (""view-database"", database), ""view-instance"", ], )","[""Internals for plugins"", ""Datasette class""]",[] plugin_hooks:plugin-hook-register-commands,plugin_hooks,plugin-hook-register-commands,register_commands(cli),"cli - the root Datasette Click command group Use this to register additional CLI commands Register additional CLI commands that can be run using datasette yourcommand ... . This provides a mechanism by which plugins can add new CLI commands to Datasette. 
This example registers a new datasette verify file1.db file2.db command that checks if the provided file paths are valid SQLite databases: from datasette import hookimpl import click import sqlite3 @hookimpl def register_commands(cli): @cli.command() @click.argument( ""files"", type=click.Path(exists=True), nargs=-1 ) def verify(files): ""Verify that files can be opened by Datasette"" for file in files: conn = sqlite3.connect(str(file)) try: conn.execute(""select * from sqlite_master"") except sqlite3.DatabaseError: raise click.ClickException( ""Invalid database: {}"".format(file) ) The new command can then be executed like so: datasette verify fixtures.db Help text (from the docstring for the function plus any defined Click arguments or options) will become available using: datasette verify --help Plugins can register multiple commands by making multiple calls to the @cli.command() decorator. Consult the Click documentation for full details on how to build a CLI command, including how to define arguments and options. Note that register_commands() plugins cannot be used with the --plugins-dir mechanism - they need to be installed into the same virtual environment as Datasette using pip install . Provided it has a setup.py file (see Packaging a plugin ) you can run pip install directly against the directory in which you are developing your plugin like so: pip install -e path/to/my/datasette-plugin Examples: datasette-auth-passwords , datasette-verify","[""Plugin hooks""]","[{""href"": ""https://click.palletsprojects.com/en/latest/commands/#callback-invocation"", ""label"": ""Click command group""}, {""href"": ""https://click.palletsprojects.com/"", ""label"": ""Click documentation""}, {""href"": ""https://datasette.io/plugins/datasette-auth-passwords"", ""label"": ""datasette-auth-passwords""}, {""href"": ""https://datasette.io/plugins/datasette-verify"", ""label"": ""datasette-verify""}]" plugin_hooks:plugin-hook-prepare-connection,plugin_hooks,plugin-hook-prepare-connection,"prepare_connection(conn, database, datasette)","conn - sqlite3 connection object The connection that is being opened database - string The name of the database datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) This hook is called when a new SQLite database connection is created. You can use it to register custom SQL functions , aggregates and collations. 
For example: from datasette import hookimpl import random @hookimpl def prepare_connection(conn): conn.create_function( ""random_integer"", 2, random.randint ) This registers a SQL function called random_integer which takes two arguments and can be called like this: select random_integer(1, 10); Examples: datasette-jellyfish , datasette-jq , datasette-haversine , datasette-rure","[""Plugin hooks""]","[{""href"": ""https://docs.python.org/2/library/sqlite3.html#sqlite3.Connection.create_function"", ""label"": ""register custom SQL functions""}, {""href"": ""https://datasette.io/plugins/datasette-jellyfish"", ""label"": ""datasette-jellyfish""}, {""href"": ""https://datasette.io/plugins/datasette-jq"", ""label"": ""datasette-jq""}, {""href"": ""https://datasette.io/plugins/datasette-haversine"", ""label"": ""datasette-haversine""}, {""href"": ""https://datasette.io/plugins/datasette-rure"", ""label"": ""datasette-rure""}]" plugin_hooks:plugin-register-routes,plugin_hooks,plugin-register-routes,register_routes(datasette),"datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) Register additional view functions to execute for specified URL routes. Return a list of (regex, view_function) pairs, something like this: from datasette import hookimpl, Response import html async def hello_from(request): name = request.url_vars[""name""] return Response.html( ""Hello from {}"".format(html.escape(name)) ) @hookimpl def register_routes(): return [(r""^/hello-from/(?P<name>.*)$"", hello_from)] The view functions can take a number of different optional arguments. The corresponding argument will be passed to your function depending on its named parameters - a form of dependency injection. The optional view function arguments are as follows: datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. request - Request object The current HTTP request. scope - dictionary The incoming ASGI scope dictionary. send - function The ASGI send function. receive - function The ASGI receive function. The view function can be a regular function or an async def function, depending on whether it needs to use any await APIs. The function can either return a Response class or it can return nothing and instead respond directly to the request using the ASGI send function (for advanced uses only). It can also raise the datasette.NotFound exception to return a 404 not found error, or the datasette.Forbidden exception for a 403 forbidden. See Designing URLs for your plugin for tips on designing the URL routes used by your plugin. Examples: datasette-auth-github , datasette-psutil","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-auth-github"", ""label"": ""datasette-auth-github""}, {""href"": ""https://datasette.io/plugins/datasette-psutil"", ""label"": ""datasette-psutil""}]" plugin_hooks:plugin-register-output-renderer,plugin_hooks,plugin-register-output-renderer,register_output_renderer(datasette),"datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) Registers a new output renderer, to output data in a custom format. 
The hook function should return a dictionary, or a list of dictionaries, of the following shape: @hookimpl def register_output_renderer(datasette): return { ""extension"": ""test"", ""render"": render_demo, ""can_render"": can_render_demo, # Optional } This will register render_demo to be called when paths with the extension .test (for example /database.test , /database/table.test , or /database/table/row.test ) are requested. render_demo is a Python function. It can be a regular function or an async def render_demo() awaitable function, depending on whether it needs to make any asynchronous calls. can_render_demo is a Python function (or async def function) which accepts the same arguments as render_demo but just returns True or False . It lets Datasette know if the current SQL query can be represented by the plugin - and hence influence whether a link to this output format is displayed in the user interface. If you omit the ""can_render"" key from the dictionary every query will be treated as being supported by the plugin. When a request is received, the ""render"" callback function is called with zero or more of the following arguments. Datasette will inspect your callback function and pass arguments that match its function signature. datasette - Datasette class For accessing plugin configuration and executing queries. columns - list of strings The names of the columns returned by this query. rows - list of sqlite3.Row objects The rows returned by the query. sql - string The SQL query that was executed. query_name - string or None If this was the execution of a canned query , the name of that query. database - string The name of the database. table - string or None The table or view, if one is being rendered. request - Request object The current HTTP request. view_name - string The name of the current view being called. index , database , table , and row are the most important ones. The callback function can return None , if it is unable to render the data, or a Response class that will be returned to the caller. It can also return a dictionary with the following keys. This format is deprecated as of Datasette 0.49 and will be removed by Datasette 1.0. body - string or bytes, optional The response body, default empty content_type - string, optional The Content-Type header, default text/plain status_code - integer, optional The HTTP status code, default 200 headers - dictionary, optional Extra HTTP headers to be returned in the response. 
An example of an output renderer callback function: def render_demo(): return Response.text(""Hello World"") Here is a more complex example: async def render_demo(datasette, columns, rows): db = datasette.get_database() result = await db.execute(""select sqlite_version()"") first_row = "" | "".join(columns) lines = [first_row] lines.append(""="" * len(first_row)) for row in rows: lines.append("" | "".join(str(value) for value in row)) return Response( ""\n"".join(lines), content_type=""text/plain; charset=utf-8"", headers={""x-sqlite-version"": result.first()[0]}, ) And here is an example can_render function which returns True only if the query results contain the columns atom_id , atom_title and atom_updated : def can_render_demo(columns): return { ""atom_id"", ""atom_title"", ""atom_updated"", }.issubset(columns) Examples: datasette-atom , datasette-ics , datasette-geojson , datasette-copyable","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-atom"", ""label"": ""datasette-atom""}, {""href"": ""https://datasette.io/plugins/datasette-ics"", ""label"": ""datasette-ics""}, {""href"": ""https://datasette.io/plugins/datasette-geojson"", ""label"": ""datasette-geojson""}, {""href"": ""https://datasette.io/plugins/datasette-copyable"", ""label"": ""datasette-copyable""}]" plugin_hooks:plugin-hook-permission-allowed,plugin_hooks,plugin-hook-permission-allowed,"permission_allowed(datasette, actor, action, resource)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. actor - dictionary The current actor, as decided by actor_from_request(datasette, request) . action - string The action to be performed, e.g. ""edit-table"" . resource - string or None An identifier for the individual resource, e.g. the name of the table. Called to check that an actor has permission to perform an action on a resource. Can return True if the action is allowed, False if the action is not allowed, or None if the plugin does not have an opinion one way or the other. Here's an example plugin which randomly decides whether a permission should be allowed or denied, except for view-instance which always uses the default permission scheme instead. from datasette import hookimpl import random @hookimpl def permission_allowed(action): if action != ""view-instance"": # Return True or False at random return random.random() > 0.5 # Returning None falls back to default permissions This function can alternatively return an awaitable function which itself returns True , False or None . You can use this option if you need to execute additional database queries using await datasette.execute(...) . Here's an example that allows users to view the admin_log table only if their actor id is present in the admin_users table. It also disallows arbitrary SQL queries for the staff.db database for all users. @hookimpl def permission_allowed(datasette, actor, action, resource): async def inner(): if action == ""execute-sql"" and resource == ""staff"": return False if action == ""view-table"" and resource == ( ""staff"", ""admin_log"", ): if not actor: return False user_id = actor[""id""] result = await datasette.get_database( ""staff"" ).execute( ""select count(*) from admin_users where user_id = :user_id"", {""user_id"": user_id}, ) return result.first()[0] > 0 return inner See built-in permissions for a full list of permissions that are included in Datasette core. 
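Datasette combines the answers from every installed permission_allowed hook when you call the datasette.permission_allowed() internal method, so you can exercise a hook like the staff.db example above from a test or from other plugin code. A sketch, assuming that plugin is installed:

from datasette.app import Datasette

async def can_view_admin_log(datasette: Datasette, actor):
    # Consults every registered permission_allowed hook, falling back to
    # Datasette's default permission scheme if they all return None
    return await datasette.permission_allowed(
        actor, ""view-table"", resource=(""staff"", ""admin_log"")
    )
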
Example: datasette-permissions-sql","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-permissions-sql"", ""label"": ""datasette-permissions-sql""}]" plugin_hooks:plugin-hook-database-actions,plugin_hooks,plugin-hook-database-actions,"database_actions(datasette, actor, database, request)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. actor - dictionary or None The currently authenticated actor . database - string The name of the database. request - Request object The current HTTP request. This hook is similar to table_actions(datasette, actor, database, table, request) but populates an actions menu on the database page. Example: datasette-graphql","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-graphql"", ""label"": ""datasette-graphql""}]" plugin_hooks:plugin-hook-table-actions,plugin_hooks,plugin-hook-table-actions,"table_actions(datasette, actor, database, table, request)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. actor - dictionary or None The currently authenticated actor . database - string The name of the database. table - string The name of the table. request - Request object or None The current HTTP request. This can be None if the request object is not available. This hook allows table actions to be displayed in a menu accessed via an action icon at the top of the table page. It should return a list of {""href"": ""..."", ""label"": ""...""} menu items. It can alternatively return an async def awaitable function which returns a list of menu items. This example adds a new table action if the signed in user is ""root"" : from datasette import hookimpl @hookimpl def table_actions(datasette, actor, database, table): if actor and actor.get(""id"") == ""root"": return [ { ""href"": datasette.urls.path( ""/-/edit-schema/{}/{}"".format( database, table ) ), ""label"": ""Edit schema for this table"", } ] Example: datasette-graphql","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-graphql"", ""label"": ""datasette-graphql""}]" plugin_hooks:plugin-hook-menu-links,plugin_hooks,plugin-hook-menu-links,"menu_links(datasette, actor, request)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. actor - dictionary or None The currently authenticated actor . request - Request object or None The current HTTP request. This can be None if the request object is not available. This hook allows additional items to be included in the menu displayed by Datasette's top right menu icon. The hook should return a list of {""href"": ""..."", ""label"": ""...""} menu items. These will be added to the menu. It can alternatively return an async def awaitable function which returns a list of menu items. This example adds a new menu item but only if the signed in user is ""root"" : from datasette import hookimpl @hookimpl def menu_links(datasette, actor): if actor and actor.get(""id"") == ""root"": return [ { ""href"": datasette.urls.path( ""/-/edit-schema"" ), ""label"": ""Edit schema"", }, ] Using datasette.urls here ensures that links in the menu will take the base_url setting into account. 
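The awaitable form is useful when building the menu requires a database query. Here is a sketch, assuming a hypothetical saved_dashboards table and a hypothetical /-/dashboards page provided by the same plugin:

from datasette import hookimpl

@hookimpl
def menu_links(datasette, actor):
    async def inner():
        db = datasette.get_database()
        # saved_dashboards is a hypothetical table used for illustration
        if await db.table_exists(""saved_dashboards""):
            return [
                {
                    ""href"": datasette.urls.path(""/-/dashboards""),
                    ""label"": ""Dashboards"",
                }
            ]
        return []

    return inner
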
Examples: datasette-search-all , datasette-graphql","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-search-all"", ""label"": ""datasette-search-all""}, {""href"": ""https://datasette.io/plugins/datasette-graphql"", ""label"": ""datasette-graphql""}]" plugin_hooks:plugin-hook-canned-queries,plugin_hooks,plugin-hook-canned-queries,"canned_queries(datasette, database, actor)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. database - string The name of the database. actor - dictionary or None The currently authenticated actor . Use this hook to return a dictionary of additional canned query definitions for the specified database. The return value should be the same shape as the JSON described in the canned query documentation. from datasette import hookimpl @hookimpl def canned_queries(datasette, database): if database == ""mydb"": return { ""my_query"": { ""sql"": ""select * from my_table where id > :min_id"" } } The hook can alternatively return an awaitable function that returns a dictionary. Here's an example that returns queries that have been stored in the saved_queries database table, if one exists: from datasette import hookimpl @hookimpl def canned_queries(datasette, database): async def inner(): db = datasette.get_database(database) if await db.table_exists(""saved_queries""): results = await db.execute( ""select name, sql from saved_queries"" ) return { result[""name""]: {""sql"": result[""sql""]} for result in results } return inner The actor parameter can be used to include the currently authenticated actor in your decision. Here's an example that returns queries that were saved by that actor: from datasette import hookimpl @hookimpl def canned_queries(datasette, database, actor): async def inner(): db = datasette.get_database(database) if actor is not None and await db.table_exists( ""saved_queries"" ): results = await db.execute( ""select name, sql from saved_queries where actor_id = :id"", {""id"": actor[""id""]}, ) return { result[""name""]: {""sql"": result[""sql""]} for result in results } return inner Example: datasette-saved-queries","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-saved-queries"", ""label"": ""datasette-saved-queries""}]" plugin_hooks:plugin-hook-actor-from-request,plugin_hooks,plugin-hook-actor-from-request,"actor_from_request(datasette, request)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. request - Request object The current HTTP request. This is part of Datasette's authentication and permissions system . The function should attempt to authenticate an actor (either a user or an API actor of some sort) based on information in the request. If it cannot authenticate an actor, it should return None . Otherwise it should return a dictionary representing that actor. 
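At its simplest the hook can derive an actor from something already present on the request. This toy sketch trusts an x-actor-id header - not safe authentication, since any client can set that header, but it shows the shape of the hook:

from datasette import hookimpl

@hookimpl
def actor_from_request(datasette, request):
    # Toy example only: a plain header can be spoofed by any client
    actor_id = request.headers.get(""x-actor-id"")
    if actor_id:
        return {""id"": actor_id}
    return None  # Defer to other plugins, or stay anonymous
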
Here's an example that authenticates the actor based on an incoming API key: from datasette import hookimpl import secrets SECRET_KEY = ""this-is-a-secret"" @hookimpl def actor_from_request(datasette, request): authorization = ( request.headers.get(""authorization"") or """" ) expected = ""Bearer {}"".format(SECRET_KEY) if secrets.compare_digest(authorization, expected): return {""id"": ""bot""} If you install this in your plugins directory you can test it like this: $ curl -H 'Authorization: Bearer this-is-a-secret' http://localhost:8003/-/actor.json Instead of returning a dictionary, this function can return an awaitable function which itself returns either None or a dictionary. This is useful for authentication functions that need to make a database query - for example: from datasette import hookimpl @hookimpl def actor_from_request(datasette, request): async def inner(): token = request.args.get(""_token"") if not token: return None # Look up ?_token=xxx in sessions table result = await datasette.get_database().execute( ""select count(*) from sessions where token = ?"", [token], ) if result.first()[0]: return {""token"": token} else: return None return inner Example: datasette-auth-tokens","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-auth-tokens"", ""label"": ""datasette-auth-tokens""}]" plugin_hooks:plugin-hook-skip-csrf,plugin_hooks,plugin-hook-skip-csrf,"skip_csrf(datasette, scope)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. scope - dictionary The ASGI scope for the incoming HTTP request. This hook can be used to skip CSRF protection for a specific incoming request. For example, you might have a custom path at /submit-comment which is designed to accept comments from anywhere, whether or not the incoming request originated on the site and has an accompanying CSRF token. This example will disable CSRF protection for that specific URL path: from datasette import hookimpl @hookimpl def skip_csrf(scope): return scope[""path""] == ""/submit-comment"" If any of the currently active skip_csrf() plugin hooks return True , CSRF protection will be skipped for the request.","[""Plugin hooks""]","[{""href"": ""https://asgi.readthedocs.io/en/latest/specs/www.html#http-connection-scope"", ""label"": ""ASGI scope""}]" plugin_hooks:plugin-hook-handle-exception,plugin_hooks,plugin-hook-handle-exception,"handle_exception(datasette, request, exception)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to render templates or execute SQL queries. request - Request object The current HTTP request. exception - Exception The exception that was raised. This hook is called any time an unexpected exception is raised. You can use it to record the exception. If your handler returns a Response object it will be returned to the client in place of the default Datasette error page. The handler can return a response directly, or it can return an awaitable function that returns a response. 
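A minimal handler can build and return that Response directly. A sketch (the page markup is arbitrary):

from datasette import hookimpl, Response
import html

@hookimpl
def handle_exception(datasette, exception):
    # A real plugin would record the exception somewhere useful too
    return Response.html(
        ""<h1>Something went wrong</h1><p>{}</p>"".format(
            html.escape(str(exception))
        ),
        status=500,
    )
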
This example logs an error to Sentry and then renders a custom error page: from datasette import hookimpl, Response import sentry_sdk @hookimpl def handle_exception(datasette, request, exception): sentry_sdk.capture_exception(exception) async def inner(): return Response.html( await datasette.render_template( ""custom_error.html"", request=request ) ) return inner Example: datasette-sentry","[""Plugin hooks""]","[{""href"": ""https://sentry.io/"", ""label"": ""Sentry""}, {""href"": ""https://datasette.io/plugins/datasette-sentry"", ""label"": ""datasette-sentry""}]" plugin_hooks:plugin-hook-forbidden,plugin_hooks,plugin-hook-forbidden,"forbidden(datasette, request, message)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to render templates or execute SQL queries. request - Request object The current HTTP request. message - string A message hinting at why the request was forbidden. Plugins can use this to customize how Datasette responds when a 403 Forbidden error occurs - usually because a page failed a permission check, see Permissions . If a plugin hook wishes to react to the error, it should return a Response object . This example returns a redirect to a /-/login page: from datasette import hookimpl, Response from urllib.parse import urlencode @hookimpl def forbidden(request, message): return Response.redirect( ""/-/login?"" + urlencode({""message"": message}) ) The function can alternatively return an awaitable function if it needs to make any asynchronous method calls. This example renders a template: from datasette import hookimpl, Response @hookimpl def forbidden(datasette, request): async def inner(): return Response.html( await datasette.render_template( ""render_message.html"", request=request ) ) return inner","[""Plugin hooks""]",[] plugin_hooks:plugin-hook-register-magic-parameters,plugin_hooks,plugin-hook-register-magic-parameters,register_magic_parameters(datasette),"datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) . Magic parameters can be used to add automatic parameters to canned queries . This plugin hook allows additional magic parameters to be defined by plugins. Magic parameters all take this format: _prefix_rest_of_parameter . The prefix indicates which magic parameter function should be called - the rest of the parameter is passed as an argument to that function. To register a new function, return it as a tuple of (string prefix, function) from this hook. The function you register should take two arguments: key and request , where key is the rest_of_parameter portion of the parameter and request is the current Request object . This example registers two new magic parameters: :_request_http_version returning the HTTP version of the current request, and :_uuid_new which returns a new UUID: from datasette import hookimpl from uuid import uuid4 def uuid(key, request): if key == ""new"": return str(uuid4()) else: raise KeyError def request(key, request): if key == ""http_version"": return request.scope[""http_version""] else: raise KeyError @hookimpl def register_magic_parameters(datasette): return [ (""request"", request), (""uuid"", uuid), ]","[""Plugin hooks""]",[] plugin_hooks:plugin-hook-get-metadata,plugin_hooks,plugin-hook-get-metadata,"get_metadata(datasette, key, database, table)","datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) . 
database - string or None The name of the database metadata is being asked for. table - string or None The name of the table. key - string or None The name of the key for which data is being asked for. This hook is responsible for returning a dictionary corresponding to Datasette Metadata . This function is passed the database , table and key which were passed to the upstream internal request for metadata. It should always return a global metadata dictionary, where ""databases"": {} would be a top-level key. The dictionary returned here will be merged with, and overwritten by, the contents of the physical metadata.yaml if one is present. The design of this plugin hook does not currently provide a mechanism for interacting with async code, and may change in the future. See issue 1384 . @hookimpl def get_metadata(datasette, key, database, table): metadata = { ""title"": ""This will be the Datasette landing page title!"", ""description"": get_instance_description(datasette), ""databases"": {}, } for db_name, db_data_dict in get_my_database_meta( datasette, database, table, key ): metadata[""databases""][db_name] = db_data_dict # whatever we return here will be merged with any other plugins using this hook and # will be overwritten by a local metadata.yaml if one exists! return metadata Example: datasette-remote-metadata plugin","[""Plugin hooks""]","[{""href"": ""https://github.com/simonw/datasette/issues/1384"", ""label"": ""issue 1384""}, {""href"": ""https://datasette.io/plugins/datasette-remote-metadata"", ""label"": ""datasette-remote-metadata plugin""}]" publish:publish-custom-metadata-and-plugins,publish,publish-custom-metadata-and-plugins,Custom metadata and plugins,"datasette publish accepts a number of additional options which can be used to further customize your Datasette instance. You can define your own Metadata and deploy that with your instance like so: datasette publish cloudrun --service=my-service mydatabase.db -m metadata.json If you just want to set the title, license or source information you can do that directly using extra options to datasette publish : datasette publish cloudrun mydatabase.db --service=my-service \ --title=""Title of my database"" \ --source=""Where the data originated"" \ --source_url=""http://www.example.com/"" You can also specify plugins you would like to install. For example, if you want to include the datasette-vega visualization plugin you can use the following: datasette publish cloudrun mydatabase.db --service=my-service --install=datasette-vega If a plugin has any Secret configuration values you can use the --plugin-secret option to set those secrets at publish time. 
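Each --plugin-secret option takes three values: the plugin name, the name of the configuration setting and the secret itself. The general shape looks like this (the plugin and setting names below are placeholders):

datasette publish cloudrun mydatabase.db --service=my-service \
    --install=name-of-plugin \
    --plugin-secret name-of-plugin setting_name secret-value
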
For example, using Heroku with datasette-auth-github you might run the following command: $ datasette publish heroku my_database.db \ --name my-heroku-app-demo \ --install=datasette-auth-github \ --plugin-secret datasette-auth-github client_id your_client_id \ --plugin-secret datasette-auth-github client_secret your_client_secret","[""Publishing data"", ""datasette publish""]","[{""href"": ""https://github.com/simonw/datasette-vega"", ""label"": ""datasette-vega""}, {""href"": ""https://github.com/simonw/datasette-auth-github"", ""label"": ""datasette-auth-github""}]" changelog:id88,changelog,id88,0.25.2 (2018-12-16),"datasette publish heroku now uses the python-3.6.7 runtime Added documentation on how to build the documentation Added documentation covering our release process Upgraded to pytest 4.0.2","[""Changelog""]",[] changelog:id87,changelog,id87,0.26 (2019-01-02),"datasette serve --reload now restarts Datasette if a database file changes on disk. The datasette publish now subcommand now takes an optional --alias mysite.now.sh argument. This will attempt to set an alias after the deploy completes. Fixed a bug where the advanced CSV export form failed to include the currently selected filters ( #393 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/393"", ""label"": ""#393""}]" internals:datasette-add-database,internals,datasette-add-database,".add_database(db, name=None, route=None)","db - datasette.database.Database instance The database to be attached. name - string, optional The name to be used for this database . If not specified, Datasette will pick one based on the filename or memory name. route - string, optional This will be used in the URL path. If not specified, it will default to the same thing as the name . The datasette.add_database(db) method lets you add a new database to the current Datasette instance. The db parameter should be an instance of the datasette.database.Database class. For example: from datasette.database import Database datasette.add_database( Database( datasette, path=""path/to/my-new-database.db"", ) ) This will add a mutable database and serve it at /my-new-database . Use is_mutable=False to add an immutable database. .add_database() returns the Database instance, with its name set as the database.name attribute. Any time you are working with a newly added database you should use the return value of .add_database() , for example: db = datasette.add_database( Database(datasette, memory_name=""statistics"") ) await db.execute_write( ""CREATE TABLE foo(id integer primary key)"" )","[""Internals for plugins"", ""Datasette class""]",[] authentication:authentication-ds-actor-expiry,authentication,authentication-ds-actor-expiry,Including an expiry time,"ds_actor cookies can optionally include a signed expiry timestamp, after which the cookies will no longer be valid. Authentication plugins may choose to use this mechanism to limit the lifetime of the cookie. For example, if a plugin implements single-sign-on against another source it may decide to set short-lived cookies so that if the user is removed from the SSO system their existing Datasette cookies will stop working shortly afterwards. To include an expiry, add an ""e"" key to the cookie value containing a base62-encoded integer representing the timestamp when the cookie should expire. 
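The base62 implementation used for this lives in datasette.utils . A quick sketch of the round trip (the timestamp is arbitrary, and it is assumed here that decode() mirrors encode()):

from datasette.utils import baseconv

# Assumption: decode() reverses encode() and returns the original integer
encoded = baseconv.base62.encode(1713600000)
assert baseconv.base62.decode(encoded) == 1713600000
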
For example, here's how to set a cookie that expires after 24 hours: import time from datasette.utils import baseconv expires_at = int(time.time()) + (24 * 60 * 60) response = Response.redirect(""/"") response.set_cookie( ""ds_actor"", datasette.sign( { ""a"": {""id"": ""cleopaws""}, ""e"": baseconv.base62.encode(expires_at), }, ""actor"", ), ) The resulting cookie will encode data that looks something like this: { ""a"": { ""id"": ""cleopaws"" }, ""e"": ""1jjSji"" }","[""Authentication and permissions"", ""The ds_actor cookie""]",[] plugin_hooks:plugin-hook-prepare-jinja2-environment,plugin_hooks,plugin-hook-prepare-jinja2-environment,"prepare_jinja2_environment(env, datasette)","env - jinja2 Environment The template environment that is being prepared datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) This hook is called with the Jinja2 environment that is used to evaluate Datasette HTML templates. You can use it to do things like register custom template filters , for example: from datasette import hookimpl @hookimpl def prepare_jinja2_environment(env): env.filters[""uppercase""] = lambda u: u.upper() You can now use this filter in your custom templates like so: Table name: {{ table|uppercase }} This function can return an awaitable function if it needs to run any async code. Examples: datasette-edit-templates","[""Plugin hooks""]","[{""href"": ""http://jinja.pocoo.org/docs/2.10/api/#custom-filters"", ""label"": ""register custom\n template filters""}, {""href"": ""https://datasette.io/plugins/datasette-edit-templates"", ""label"": ""datasette-edit-templates""}]" internals:datasette-setting,internals,datasette-setting,.setting(key),"key - string The name of the setting, e.g. base_url . Returns the configured value for the specified setting . This can be a string, boolean or integer depending on the requested setting. For example: downloads_are_allowed = datasette.setting(""allow_download"")","[""Internals for plugins"", ""Datasette class""]",[] contributing:general-guidelines,contributing,general-guidelines,General guidelines,"main should always be releasable . Incomplete features should live in branches. This ensures that any small bug fixes can be quickly released. The ideal commit should bundle together the implementation, unit tests and associated documentation updates. The commit message should link to an associated issue. New plugin hooks should only be shipped if accompanied by a separate release of a non-demo plugin that uses them.","[""Contributing""]",[] internals:datasette-remove-database,internals,datasette-remove-database,.remove_database(name),"name - string The name of the database to be removed. This removes a database that has been previously added. name= is the unique name of that database.","[""Internals for plugins"", ""Datasette class""]",[] internals:datasette-get-database,internals,datasette-get-database,.get_database(name),"name - string, optional The name of the database - optional. Returns the specified database object. Raises a KeyError if the database does not exist. Call this method without an argument to return the first connected database.","[""Internals for plugins"", ""Datasette class""]",[] installation:installation-pipx,installation,installation-pipx,Using pipx,"pipx is a tool for installing Python software with all of its dependencies in an isolated environment, to ensure that they will not conflict with any other installed Python software. 
If you use Homebrew on macOS you can install pipx like this: brew install pipx pipx ensurepath Without Homebrew you can install it like so: python3 -m pip install --user pipx python3 -m pipx ensurepath The pipx ensurepath command configures your shell to ensure it can find commands that have been installed by pipx - generally by making sure ~/.local/bin has been added to your PATH . Once pipx is installed you can use it to install Datasette like this: pipx install datasette Then run datasette --version to confirm that it has been successfully installed.","[""Installation"", ""Advanced installation options""]","[{""href"": ""https://pipxproject.github.io/pipx/"", ""label"": ""pipx""}, {""href"": ""https://brew.sh/"", ""label"": ""Homebrew""}]" internals:datasette-plugin-config,internals,datasette-plugin-config,".plugin_config(plugin_name, database=None, table=None)","plugin_name - string The name of the plugin to look up configuration for. Usually this is something similar to datasette-cluster-map . database - None or string The database the user is interacting with. table - None or string The table the user is interacting with. This method lets you read plugin configuration values that were set in metadata.json . See Writing plugins that accept configuration for full details of how this method should be used. The return value will be the value from the configuration file - usually a dictionary. If the plugin is not configured the return value will be None .","[""Internals for plugins"", ""Datasette class""]",[] plugin_hooks:plugin-hook-publish-subcommand,plugin_hooks,plugin-hook-publish-subcommand,publish_subcommand(publish),"publish - Click publish command group The Click command group for the datasette publish subcommand This hook allows you to create new providers for the datasette publish command. Datasette uses this hook internally to implement the default cloudrun and heroku subcommands, so you can read their source to see examples of this hook in action. Let's say you want to build a plugin that adds a datasette publish my_hosting_provider --api_key=xxx mydatabase.db publish command. Your implementation would start like this: from datasette import hookimpl from datasette.publish.common import ( add_common_publish_arguments_and_options, ) import click @hookimpl def publish_subcommand(publish): @publish.command() @add_common_publish_arguments_and_options @click.option( ""-k"", ""--api_key"", help=""API key for talking to my hosting provider"", ) def my_hosting_provider( files, metadata, extra_options, branch, template_dir, plugins_dir, static, install, plugin_secret, version_note, secret, title, license, license_url, source, source_url, about, about_url, api_key, ): ... Examples: datasette-publish-fly , datasette-publish-vercel","[""Plugin hooks""]","[{""href"": ""https://github.com/simonw/datasette/tree/main/datasette/publish"", ""label"": ""their source""}, {""href"": ""https://datasette.io/plugins/datasette-publish-fly"", ""label"": ""datasette-publish-fly""}, {""href"": ""https://datasette.io/plugins/datasette-publish-vercel"", ""label"": ""datasette-publish-vercel""}]" changelog:new-plugin-hooks,changelog,new-plugin-hooks,New plugin hooks,"register_magic_parameters(datasette) can be used to define new types of magic canned query parameters. startup(datasette) can run custom code when Datasette first starts up. datasette-init is a new plugin that uses this hook to create database tables and views on startup if they have not yet been created. 
( #834 ) canned_queries(datasette, database, actor) lets plugins provide additional canned queries beyond those defined in Datasette's metadata. See datasette-saved-queries for an example of this hook in action. ( #852 ) forbidden(datasette, request, message) is a hook for customizing how Datasette responds to 403 forbidden errors. ( #812 )","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette-init"", ""label"": ""datasette-init""}, {""href"": ""https://github.com/simonw/datasette/issues/834"", ""label"": ""#834""}, {""href"": ""https://github.com/simonw/datasette-saved-queries"", ""label"": ""datasette-saved-queries""}, {""href"": ""https://github.com/simonw/datasette/issues/852"", ""label"": ""#852""}, {""href"": ""https://github.com/simonw/datasette/issues/812"", ""label"": ""#812""}]" plugin_hooks:plugin-hook-filters-from-request,plugin_hooks,plugin-hook-filters-from-request,"filters_from_request(request, database, table, datasette)","request - Request object The current HTTP request. database - string The name of the database. table - string The name of the table. datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. This hook runs on the table page, and can influence the where clause of the SQL query used to populate that page, based on query string arguments on the incoming request. The hook should return an instance of datasette.filters.FilterArguments which has one required and three optional arguments: return FilterArguments( where_clauses=[""id > :max_id""], params={""max_id"": 5}, human_descriptions=[""max_id is greater than 5""], extra_context={}, ) The arguments to the FilterArguments class constructor are as follows: where_clauses - list of strings, required A list of SQL fragments that will be inserted into the SQL query, joined by the and operator. These can include :named parameters which will be populated using data in params . params - dictionary, optional Additional keyword arguments to be used when the query is executed. These should match any :arguments in the where clauses. human_descriptions - list of strings, optional These strings will be included in the human-readable description at the top of the page and in the page title. extra_context - dictionary, optional Additional context variables that should be made available to the table.html template when it is rendered. This example plugin causes 0 results to be returned if ?_nothing=1 is added to the URL: from datasette import hookimpl from datasette.filters import FilterArguments @hookimpl def filters_from_request(request): if request.args.get(""_nothing""): return FilterArguments( [""1 = 0""], human_descriptions=[""NOTHING""] ) Example: datasette-leaflet-freedraw","[""Plugin hooks""]","[{""href"": ""https://datasette.io/plugins/datasette-leaflet-freedraw"", ""label"": ""datasette-leaflet-freedraw""}]" internals:datasette-add-message,internals,datasette-add-message,".add_message(request, message, type=datasette.INFO)","request - Request The current Request object message - string The message string type - constant, optional The message type - datasette.INFO , datasette.WARNING or datasette.ERROR Datasette's flash messaging mechanism allows you to add a message that will be displayed to the user on the next page that they visit. Messages are persisted in a ds_messages cookie. This method adds a message to that cookie. 
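For example, a view function registered via register_routes() might flash a confirmation message before redirecting. A sketch - the view, path and message are hypothetical:

from datasette import Response

async def create_table(datasette, request):
    # ... perform the write here, then flash a confirmation ...
    datasette.add_message(request, ""Table created"", type=datasette.INFO)
    return Response.redirect(""/"")
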
You can try out these messages (including the different visual styling of the three message types) using the /-/messages debugging tool.","[""Internals for plugins"", ""Datasette class""]",[] internals:datasette-absolute-url,internals,datasette-absolute-url,".absolute_url(request, path)","request - Request The current Request object path - string A path, for example /dbname/table.json Returns the absolute URL for the given path, including the protocol and host. For example: absolute_url = datasette.absolute_url( request, ""/dbname/table.json"" ) # Would return ""http://localhost:8001/dbname/table.json"" The current request object is used to determine the hostname and protocol that should be used for the returned URL. The force_https_urls configuration setting is taken into account.","[""Internals for plugins"", ""Datasette class""]",[] internals:internals-multiparams,internals,internals-multiparams,The MultiParams class,"request.args is a MultiParams object - a dictionary-like object which provides access to query string parameters that may have multiple values. Consider the query string ?foo=1&foo=2&bar=3 - with two values for foo and one value for bar . request.args[key] - string Returns the first value for that key, or raises a KeyError if the key is missing. For the above example request.args[""foo""] would return ""1"" . request.args.get(key) - string or None Returns the first value for that key, or None if the key is missing. Pass a second argument to specify a different default, e.g. q = request.args.get(""q"", """") . request.args.getlist(key) - list of strings Returns the list of strings for that key. request.args.getlist(""foo"") would return [""1"", ""2""] in the above example. request.args.getlist(""bar"") would return [""3""] . If the key is missing an empty list will be returned. request.args.keys() - list of strings Returns the list of available keys - for the example this would be [""foo"", ""bar""] . key in request.args - True or False You can use if key in request.args to check if a key is present. for key in request.args - iterator This lets you loop through every available key. len(request.args) - integer Returns the number of keys.","[""Internals for plugins""]",[] changelog:id70,changelog,id70,0.33 (2019-12-22),"rowid is now included in dropdown menus for filtering tables ( #636 ) Columns are now only suggested for faceting if they have at least one value with more than one record ( #638 ) Queries with no results now display ""0 results"" ( #637 ) Improved documentation for the --static option ( #641 ) asyncio task information is now included on the /-/threads debug page Bumped Uvicorn dependency to 0.11 You can now use --port 0 to listen on an available port New template_debug setting for debugging templates, e.g. 
https://latest.datasette.io/fixtures/roadside_attractions?_context=1 ( #654 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/636"", ""label"": ""#636""}, {""href"": ""https://github.com/simonw/datasette/issues/638"", ""label"": ""#638""}, {""href"": ""https://github.com/simonw/datasette/issues/637"", ""label"": ""#637""}, {""href"": ""https://github.com/simonw/datasette/issues/641"", ""label"": ""#641""}, {""href"": ""https://latest.datasette.io/fixtures/roadside_attractions?_context=1"", ""label"": ""https://latest.datasette.io/fixtures/roadside_attractions?_context=1""}, {""href"": ""https://github.com/simonw/datasette/issues/654"", ""label"": ""#654""}]" internals:datasette-unsign,internals,datasette-unsign,".unsign(value, namespace=""default"")","value - string The signed string that was created using .sign(value, namespace=""default"") . namespace - string, optional The alternative namespace, if one was used. Returns the original, decoded object that was passed to .sign(value, namespace=""default"") . If the signature is not valid this raises an itsdangerous.BadSignature exception.","[""Internals for plugins"", ""Datasette class""]",[] full_text_search:configuring-fts-using-sqlite-utils,full_text_search,configuring-fts-using-sqlite-utils,Configuring FTS using sqlite-utils,"sqlite-utils is a CLI utility and Python library for manipulating SQLite databases. You can use it from Python code to configure FTS search, or you can achieve the same goal using the accompanying command-line tool . Here's how to use sqlite-utils to enable full-text search for an items table across the name and description columns: $ sqlite-utils enable-fts mydatabase.db items name description","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://sqlite-utils.datasette.io/"", ""label"": ""sqlite-utils""}, {""href"": ""https://sqlite-utils.datasette.io/en/latest/python-api.html#enabling-full-text-search"", ""label"": ""it from Python code""}, {""href"": ""https://sqlite-utils.datasette.io/en/latest/cli.html#configuring-full-text-search"", ""label"": ""using the accompanying command-line tool""}]" ecosystem:sqlite-utils,ecosystem,sqlite-utils,sqlite-utils,"sqlite-utils is a key building block for the wider Datasette ecosystem. It provides a collection of utilities for manipulating SQLite databases, both as a Python library and a command-line utility. Features include: Insert data into a SQLite database from JSON, CSV or TSV, automatically creating tables with the correct schema or altering existing tables to add missing columns. Configure tables for use with SQLite full-text search, including creating triggers needed to keep the search index up-to-date. Modify tables in ways that are not supported by SQLite's default ALTER TABLE syntax - for example changing the types of columns or selecting a new primary key for a table. Add foreign keys to existing database tables. Extract columns of data into a separate lookup table.","[""The Datasette Ecosystem""]","[{""href"": ""https://sqlite-utils.datasette.io/"", ""label"": ""sqlite-utils""}]" internals:datasette-render-template,internals,datasette-render-template,"await .render_template(template, context=None, request=None)","template - string, list of strings or jinja2.Template The template file to be rendered, e.g. my_plugin.html . 
Datasette will search for this file first in the --template-dir= location (if one was specified), then in the plugin's bundled templates, and finally in Datasette's set of default templates. If this is a list of template file names then the first one that exists will be loaded and rendered. If this is a Jinja Template object it will be used directly. context - None or a Python dictionary The context variables to pass to the template. request - request object or None If you pass a Datasette request object here it will be made available to the template. Renders a Jinja template using Datasette's preconfigured instance of Jinja and returns the resulting string. The template will have access to Datasette's default template functions and any functions that have been made available by other plugins.","[""Internals for plugins"", ""Datasette class""]","[{""href"": ""https://jinja.palletsprojects.com/en/2.11.x/api/#jinja2.Template"", ""label"": ""Template object""}, {""href"": ""https://jinja.palletsprojects.com/en/2.11.x/"", ""label"": ""Jinja template""}]" internals:datasette-sign,internals,datasette-sign,".sign(value, namespace=""default"")","value - any serializable type The value to be signed. namespace - string, optional An alternative namespace, see the itsdangerous salt documentation . Utility method for signing values, such that you can safely pass data to and from an untrusted environment. This is a wrapper around the itsdangerous library. This method returns a signed string, which can be decoded and verified using .unsign(value, namespace=""default"") .","[""Internals for plugins"", ""Datasette class""]","[{""href"": ""https://itsdangerous.palletsprojects.com/en/1.1.x/serializer/#the-salt"", ""label"": ""itsdangerous salt documentation""}, {""href"": ""https://itsdangerous.palletsprojects.com/"", ""label"": ""itsdangerous""}]"
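A round trip through .sign() and .unsign() from within a plugin, where a datasette instance is available, looks like this (a sketch - the value and the ""demo"" namespace are arbitrary):

value = {""id"": ""cleopaws""}
signed = datasette.sign(value, ""demo"")
# unsign() raises itsdangerous.BadSignature if the string was tampered with
assert datasette.unsign(signed, ""demo"") == value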