rowid,title,content,sections_fts,rank
378,0.25.2 (2018-12-16),"datasette publish heroku now uses the python-3.6.7 runtime
Added documentation on how to build the documentation
Added documentation covering our release process
Upgraded to pytest 4.0.2",8,
377,0.26 (2019-01-02),"datasette serve --reload now restarts Datasette if a database file changes on disk.
datasette publish now now takes an optional --alias mysite.now.sh argument. This will attempt to set an alias after the deploy completes.
Fixed a bug where the advanced CSV export form failed to include the currently selected filters ( #393 )",8,
376,0.26.1 (2019-01-10),"/-/versions now includes SQLite compile_options ( #396 )
datasetteproject/datasette Docker image now uses SQLite 3.26.0 ( #397 )
Cleaned up some deprecation warnings under Python 3.7",8,
375,0.27 (2019-01-31),"New command: datasette plugins ( documentation ) shows you the currently installed list of plugins.
Datasette can now output newline-delimited JSON using the new ?_shape=array&_nl=on query string option.
Added documentation on The Datasette Ecosystem .
Now using Python 3.7.2 as the base for the official Datasette Docker image .",8,
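The newline-delimited JSON option mentioned above emits one JSON object per line rather than a single array. A minimal sketch of that output shape (the rows here are invented, not Datasette output):

```python
import json

# Rows as ?_shape=array would return them
rows = [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": "Bob"},
]

# Adding &_nl=on turns the array into newline-delimited JSON:
# one object per line, convenient for streaming tools like jq
ndjson = "\n".join(json.dumps(row) for row in rows)
print(ndjson)
```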
374,0.27.1 (2019-05-09),"Tiny bugfix release: don't install tests/ in the wrong place. Thanks, Veit Heller.",8,
373,Small changes,"We now show the size of the database file next to the download link ( #172 )
New /-/databases introspection page shows currently connected databases ( #470 )
Binary data is no longer displayed on the table and row pages ( #442 - thanks, Russ Garrett)
New show/hide SQL links on custom query pages ( #415 )
The extra_body_script plugin hook now accepts an optional view_name argument ( #443 - thanks, Russ Garrett)
Bumped Jinja2 dependency to 2.10.1 ( #426 )
All table filters are now documented, and documentation is enforced via unit tests ( 2c19a27 )
New project guideline: master should stay shippable at all times! ( 31f36e1 )
Fixed a bug where sqlite_timelimit() occasionally failed to clean up after itself ( bac4e01 )
We no longer load additional plugins when executing pytest ( #438 )
Homepage now links to database views if there are fewer than five tables in a database ( #373 )
The --cors option is now respected by error pages ( #453 )
datasette publish heroku now uses the --include-vcs-ignore option, which means it works under Travis CI ( #407 )
datasette publish heroku now publishes using Python 3.6.8 ( 666c374 )
Renamed datasette publish now to datasette publish nowv1 ( #472 )
datasette publish nowv1 now accepts multiple --alias parameters ( 09ef305 )
Removed the datasette skeleton command ( #476 )
The documentation on how to build the documentation now recommends sphinx-autobuild",8,
372,Medium changes,"Datasette now conforms to the Black coding style ( #449 ) - and has a unit test to enforce this in the future
New Special table arguments :
?columnname__in=value1,value2,value3 filter for executing SQL IN queries against a table, see Table arguments ( #433 )
?columnname__date=yyyy-mm-dd filter which returns rows where the specified datetime column falls on the specified date ( 583b22a )
?tags__arraycontains=tag filter which acts against a JSON array contained in a column ( 78e45ea )
?_where=sql-fragment filter for the table view ( #429 )
?_fts_table=mytable and ?_fts_pk=mycolumn query string options can be used to specify which FTS table to use for a search query - see Configuring full-text search for a table or view ( #428 )
You can now pass the same table filter multiple times - for example, ?content__not=world&content__not=hello will return all rows where the content column is neither hello nor world ( #288 )
You can now specify about and about_url metadata (in addition to source and license ) linking to further information about a project - see Source, license and about
New ?_trace=1 parameter now adds debug information showing every SQL query that was executed while constructing the page ( #435 )
datasette inspect now just calculates table counts, and does not introspect other database metadata ( #462 )
Removed /-/inspect page entirely - this will be replaced by something similar in the future, see #465
Datasette can now run against an in-memory SQLite database. You can do this by starting it without passing any files or by using the new --memory option to datasette serve . This can be useful for experimenting with SQLite queries that do not access any data, such as SELECT 1+1 or SELECT sqlite_version() .",8,
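The SQL behind two of the new filters, plus the in-memory mode, can be sketched with the stdlib sqlite3 module (table and column names below are invented for illustration):

```python
import sqlite3

# Datasette's --memory option runs against an in-memory SQLite
# database; the sqlite3 module does the same with ":memory:"
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facetable (id INTEGER, state TEXT, created TEXT)")
conn.executemany(
    "INSERT INTO facetable VALUES (?, ?, ?)",
    [(1, "CA", "2019-01-14 08:00:00"),
     (2, "MI", "2019-01-14 12:30:00"),
     (3, "CA", "2019-02-01 09:00:00")],
)

# ?state__in=CA,MI translates to a SQL IN query:
in_rows = conn.execute(
    "SELECT id FROM facetable WHERE state IN (?, ?)", ["CA", "MI"]
).fetchall()

# ?created__date=2019-01-14 matches datetimes falling on that date:
date_rows = conn.execute(
    "SELECT id FROM facetable WHERE date(created) = ?", ["2019-01-14"]
).fetchall()
print(in_rows, date_rows)
```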
371,register_output_renderer plugins,"Russ Garrett implemented a new Datasette plugin hook called register_output_renderer ( #441 ) which allows plugins to create additional output renderers in addition to Datasette's default .json and .csv .
Russ's in-development datasette-geo plugin includes an example of this hook being used to output .geojson automatically converted from SpatiaLite.",8,
370,datasette publish cloudrun,"Google Cloud Run is a brand new serverless hosting platform from Google, which allows you to build a Docker container which will run only when HTTP traffic is received and will shut down (and hence cost you nothing) the rest of the time. It's similar to Zeit's Now v1 Docker hosting platform which sadly is no longer accepting signups from new users.
The new datasette publish cloudrun command was contributed by Romain Primet ( #434 ) and publishes selected databases to a new Datasette instance running on Google Cloud Run.
See Publishing to Google Cloud Run for full documentation.",8,
369,"Faceting improvements, and faceting plugins","Datasette Facets provide an intuitive way to quickly summarize and interact with data. Previously the only supported faceting technique was column faceting, but 0.28 introduces two powerful new capabilities: facet-by-JSON-array and the ability to define further facet types using plugins.
Facet by array ( #359 ) is only available if your SQLite installation provides the json1 extension. Datasette will automatically detect columns that contain JSON arrays of values and offer a faceting interface against those columns - useful for modelling things like tags without needing to break them out into a new table. See Facet by JSON array for more.
The new register_facet_classes() plugin hook ( #445 ) can be used to register additional custom facet classes. Each facet class should provide two methods: suggest() which suggests facet selections that might be appropriate for a provided SQL query, and facet_results() which executes a facet operation and returns results. Datasette's own faceting implementations have been refactored to use the same API as these plugins.",8,
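What facet-by-array does under the hood can be sketched with the json1 json_each() table-valued function, which expands each array element into its own row (the sample data is invented; this assumes a SQLite build with the JSON functions available, as most modern builds are):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE places (name TEXT, tags TEXT)")
conn.executemany("INSERT INTO places VALUES (?, ?)", [
    ("Museum", '["indoor", "family"]'),
    ("Beach", '["outdoor", "family"]'),
])

# Expand each JSON array element into its own row, then count per tag -
# the same aggregation a facet-by-array interface presents
facets = conn.execute("""
    SELECT j.value AS tag, count(*) AS n
    FROM places, json_each(places.tags) AS j
    GROUP BY j.value ORDER BY n DESC, tag
""").fetchall()
print(facets)
```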
368,Supporting databases that change,"From the beginning of the project, Datasette has been designed with read-only databases in mind. If a database is guaranteed not to change it opens up all kinds of interesting opportunities - from taking advantage of SQLite immutable mode and HTTP caching to bundling static copies of the database directly in a Docker container. The interesting ideas in Datasette explores this idea in detail.
As my goals for the project have developed, I realized that read-only databases are no longer the right default. SQLite actually supports concurrent access very well provided only one thread attempts to write to a database at a time, and I keep encountering sensible use-cases for running Datasette on top of a database that is processing inserts and updates.
So, as-of version 0.28 Datasette no longer assumes that a database file will not change. It is now safe to point Datasette at a SQLite database which is being updated by another process.
Making this change was a lot of work - see tracking tickets #418 , #419 and #420 . It required new thinking around how Datasette should calculate table counts (an expensive operation against a large, changing database) and also meant reconsidering the ""content hash"" URLs Datasette has used in the past to optimize the performance of HTTP caches.
Datasette can still run against immutable files and gains numerous performance benefits from doing so, but this is no longer the default behaviour. Take a look at the new Performance and caching documentation section for details on how to make the most of Datasette against data that you know will be staying read-only and immutable.",8,
367,0.28 (2019-05-19),A salmagundi of new features!,8,
366,Small changes,"Databases published using datasette publish now open in Immutable mode . ( #469 )
?col__date= now works for columns containing spaces
Automatic label detection (for deciding which column to show when linking to a foreign key) has been improved. ( #485 )
Fixed bug where pagination broke when combined with an expanded foreign key. ( #489 )
Contributors can now run pip install -e .[docs] to get all of the dependencies needed to build the documentation, including cd docs && make livehtml support.
Datasette's dependencies are now all specified using the ~= match operator. ( #532 )
white-space: pre-wrap now used for table creation SQL. ( #505 )
Full list of commits between 0.28 and 0.29.",8,
365,?_through= for joins through many-to-many tables,"The new ?_through={json} argument to the Table view allows records to be filtered based on a many-to-many relationship. See Special table arguments for full documentation - here's an example . ( #355 )
This feature was added to help support facet by many-to-many , which isn't quite ready yet but will be coming in the next Datasette release.",8,
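The ?_through= argument takes a JSON object naming a join table, a column on it, and a value. A sketch of the shape of the argument and the SQL join it corresponds to (the tables here are invented examples, not Datasette fixtures):

```python
import json
import sqlite3

# The JSON argument names the join table, column and value:
through = {"table": "book_authors", "column": "author_id", "value": "1"}
print("?_through=" + json.dumps(through))

# The equivalent many-to-many join in plain SQL:
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE book_authors (book_id INTEGER, author_id INTEGER);
    INSERT INTO books VALUES (1, 'Dune'), (2, 'Emma');
    INSERT INTO book_authors VALUES (1, 1), (2, 2);
""")
rows = conn.execute("""
    SELECT books.title FROM books
    JOIN book_authors ON book_authors.book_id = books.id
    WHERE book_authors.author_id = :value
""", {"value": "1"}).fetchall()
print(rows)
```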
364,Easier custom templates for table rows,"If you want to customize the display of individual table rows, you can do so using a _table.html template include that looks something like this:
{% for row in display_rows %}
{{ row[""title""] }}
{{ row[""description""] }}
Category: {{ row.display(""category_id"") }}
{% endfor %}
This is a backwards incompatible change . If you previously had a custom template called _rows_and_columns.html you need to rename it to _table.html .
See Custom templates for full details.",8,
363,Facet by date,"If a column contains datetime values, Datasette can now facet that column by date. ( #481 )",8,
362,Secret plugin configuration options,"Plugins like datasette-auth-github need a safe way to set secret configuration options. Since the default mechanism for configuring plugins exposes those settings in /-/metadata a new mechanism was needed. Secret configuration values describes how plugins can now specify that their settings should be read from a file or an environment variable:
{
""plugins"": {
""datasette-auth-github"": {
""client_secret"": {
""$env"": ""GITHUB_CLIENT_SECRET""
}
}
}
}
These plugin secrets can be set directly using datasette publish . See Custom metadata and plugins for details. ( #538 and #543 )",8,
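The substitution performed for {"$env": ...} values can be illustrated with a small stdlib sketch; this is not Datasette's implementation, just the idea of resolving such placeholders against the environment at startup:

```python
import json
import os

os.environ["GITHUB_CLIENT_SECRET"] = "xyz-secret"

config = json.loads("""
{
  "plugins": {
    "datasette-auth-github": {
      "client_secret": {"$env": "GITHUB_CLIENT_SECRET"}
    }
  }
}
""")

def resolve(value):
    # Recursively replace {"$env": NAME} dicts with the variable's value
    if isinstance(value, dict):
        if set(value) == {"$env"}:
            return os.environ[value["$env"]]
        return {k: resolve(v) for k, v in value.items()}
    if isinstance(value, list):
        return [resolve(v) for v in value]
    return value

resolved = resolve(config)
print(resolved["plugins"]["datasette-auth-github"]["client_secret"])
```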
361,New plugin hook: extra_template_vars,"The extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook allows plugins to inject their own additional variables into the Datasette template context. This can be used in conjunction with custom templates to customize the Datasette interface. datasette-auth-github uses this hook to add custom HTML to the new top navigation bar (which is designed to be modified by plugins, see #540 ).",8,
360,New plugin hook: asgi_wrapper,"The asgi_wrapper(datasette) plugin hook allows plugins to entirely wrap the Datasette ASGI application in their own ASGI middleware. ( #520 )
Two new plugins take advantage of this hook:
datasette-auth-github adds an authentication layer: users will have to sign in using their GitHub account before they can view data or interact with Datasette. You can also use it to restrict access to specific GitHub users, or to members of specified GitHub organizations or teams .
datasette-cors allows you to configure CORS headers for your Datasette instance. You can use this to enable JavaScript running on a whitelisted set of domains to make fetch() calls to the JSON API provided by your Datasette instance.",8,
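The shape of an asgi_wrapper hook is a function that receives the datasette object and returns a wrapper for the ASGI app. A self-contained sketch against a toy ASGI app (no Datasette required; the header name is invented):

```python
import asyncio

def asgi_wrapper(datasette):
    # Returns a function that wraps the ASGI application
    def wrap_with_header(app):
        async def wrapped(scope, receive, send):
            async def send_with_header(event):
                # Inject an extra response header on the way out
                if event["type"] == "http.response.start":
                    event["headers"] = list(event.get("headers", [])) + [
                        (b"x-wrapped", b"1")
                    ]
                await send(event)
            await app(scope, receive, send_with_header)
        return wrapped
    return wrap_with_header

async def toy_app(scope, receive, send):
    await send({"type": "http.response.start", "status": 200, "headers": []})
    await send({"type": "http.response.body", "body": b"hello"})

sent = []
async def capture(event):
    sent.append(event)

wrapped_app = asgi_wrapper(datasette=None)(toy_app)
asyncio.run(wrapped_app({"type": "http"}, None, capture))
print(sent[0]["headers"])
```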
359,ASGI,"ASGI is the Asynchronous Server Gateway Interface standard. I've been wanting to convert Datasette into an ASGI application for over a year - Port Datasette to ASGI #272 tracks thirteen months of intermittent development - but with Datasette 0.29 the change is finally released. This also means Datasette now runs on top of Uvicorn and no longer depends on Sanic .
I wrote about the significance of this change in Porting Datasette to ASGI, and Turtles all the way down .
The most exciting consequence of this change is that Datasette plugins can now take advantage of the ASGI standard.",8,
358,0.29 (2019-07-07),"ASGI, new plugin hooks, facet by date and much, much more...",8,
357,0.29.1 (2019-07-11),"Fixed bug with static mounts using relative paths which could lead to traversal exploits ( #555 ) - thanks Abdussamet Kocak!
Datasette can now be run as a module: python -m datasette ( #556 ) - thanks, Abdussamet Kocak!",8,
356,0.29.2 (2019-07-13),"Bumped Uvicorn to 0.8.4, fixing a bug where the query string was not included in the server logs. ( #559 )
Fixed bug where the navigation breadcrumbs were not displayed correctly on the page for a custom query. ( #558 )
Fixed bug where custom query names containing unicode characters caused errors.",8,
355,0.29.3 (2019-09-02),"Fixed implementation of CodeMirror on database page ( #560 )
Documentation typo fixes - thanks, Min ho Kim ( #561 )
Mechanism for detecting if a table has FTS enabled now works if the table name used alternative escaping mechanisms ( #570 ) - for compatibility with a recent change to sqlite-utils .",8,
354,0.30 (2019-10-18),"Added /-/threads debugging page
Allow EXPLAIN WITH... ( #583 )
Button to format SQL - thanks, Tobias Kunze ( #136 )
Sort databases on homepage by argument order - thanks, Tobias Kunze ( #585 )
Display metadata footer on custom SQL queries - thanks, Tobias Kunze ( #589 )
Use --platform=managed for publish cloudrun ( #587 )
Fixed bug returning non-ASCII characters in CSV ( #584 )
Fix for /foo vs. /foo-bar bug ( #601 )",8,
353,0.30.1 (2019-10-30),"Fixed bug where ?_where= parameter was not persisted in hidden form fields ( #604 )
Fixed bug with .JSON representation of row pages - thanks, Chris Shaw ( #603 )",8,
352,0.30.2 (2019-11-02),"/-/plugins page now uses distribution name e.g. datasette-cluster-map instead of the name of the underlying Python package ( datasette_cluster_map ) ( #606 )
Array faceting is now only suggested for columns that contain arrays of strings ( #562 )
Better documentation for the --host argument ( #574 )
Don't show None with a broken link for the label on a nullable foreign key ( #406 )",8,
351,0.31 (2019-11-11),"This version adds compatibility with Python 3.8 and breaks compatibility with Python 3.5.
If you are still running Python 3.5 you should stick with 0.30.2 , which you can install like this:
pip install datasette==0.30.2
Format SQL button now works with read-only SQL queries - thanks, Tobias Kunze ( #602 )
New ?column__notin=x,y,z filter for table views ( #614 )
Table view now uses select col1, col2, col3 instead of select *
Database filenames can now contain spaces - thanks, Tobias Kunze ( #590 )
Removed obsolete ?_group_count=col feature ( #504 )
Improved user interface and documentation for datasette publish cloudrun ( #608 )
Tables with indexes now show the CREATE INDEX statements on the table page ( #618 )
Current version of uvicorn is now shown on /-/versions
Python 3.8 is now supported! ( #622 )
Python 3.5 is no longer supported.",8,
350,0.31.1 (2019-11-12),Deployments created using datasette publish now use python:3.8 base Docker image ( #629 ),8,
349,0.31.2 (2019-11-13),"Fixed a bug where datasette publish heroku applications failed to start ( #633 )
Fix for datasette publish with just --source_url - thanks, Stanley Zheng ( #572 )
Deployments to Heroku now use Python 3.8.0 ( #632 )",8,
348,0.32 (2019-11-14),"Datasette now renders templates using Jinja async mode . This means plugins can provide custom template functions that perform asynchronous actions, for example the new datasette-template-sql plugin which allows custom templates to directly execute SQL queries and render their results. ( #628 )",8,
347,0.33 (2019-12-22),"rowid is now included in dropdown menus for filtering tables ( #636 )
Columns are now only suggested for faceting if they have at least one value with more than one record ( #638 )
Queries with no results now display ""0 results"" ( #637 )
Improved documentation for the --static option ( #641 )
asyncio task information is now included on the /-/threads debug page
Bumped Uvicorn dependency to 0.11
You can now use --port 0 to listen on an available port
New template_debug setting for debugging templates, e.g. https://latest.datasette.io/fixtures/roadside_attractions?_context=1 ( #654 )",8,
346,0.34 (2020-01-29),"_search= queries are now correctly escaped using a new escape_fts() custom SQL function. This means you can now run searches for strings like park. without seeing errors. ( #651 )
Google Cloud Run is no longer in beta, so datasette publish cloudrun has been updated to work even if the user has not installed the gcloud beta components package. Thanks, Katie McLaughlin ( #660 )
datasette package now accepts a --port option for specifying which port the resulting Docker container should listen on. ( #661 )",8,
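The problem escape_fts() solves is that raw FTS syntax rejects strings like park. - one simple approach (a sketch, not Datasette's exact implementation) is to wrap each token in double quotes so it is treated as a phrase. This assumes a SQLite build with FTS5 compiled in, which most modern builds have:

```python
import sqlite3

def escape_fts_query(query):
    # Quote each whitespace-separated token, doubling any embedded quotes
    return " ".join(
        '"{}"'.format(token.replace('"', '""')) for token in query.split()
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(body)")
conn.execute("INSERT INTO docs VALUES ('visit the park. it is nice')")

escaped = escape_fts_query("park.")
rows = conn.execute(
    "SELECT body FROM docs WHERE docs MATCH ?", [escaped]
).fetchall()
print(escaped, rows)
```

Unescaped, `park.` would raise an FTS syntax error; quoted, it matches as a phrase.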
345,0.35 (2020-02-04),"Added five new plugins and one new conversion tool to the The Datasette Ecosystem .
The Datasette class has a new render_template() method which can be used by plugins to render templates using Datasette's pre-configured Jinja templating library.
You can now execute SQL queries that start with a -- comment - thanks, Jay Graves ( #653 )",8,
344,0.36 (2020-02-21),"The datasette object passed to plugins now has API documentation: Datasette class . ( #576 )
New methods on datasette : .add_database() and .remove_database() - documentation . ( #671 )
prepare_connection() plugin hook now takes optional datasette and database arguments - prepare_connection(conn, database, datasette) . ( #678 )
Added three new plugins and one new conversion tool to the The Datasette Ecosystem .",8,
343,0.37 (2020-02-25),"Plugins now have a supported mechanism for writing to a database, using the new .execute_write() and .execute_write_fn() methods. Documentation . ( #682 )
Immutable databases that have had their rows counted using the inspect command now use the calculated count more effectively - thanks, Kevin Keogh. ( #666 )
--reload no longer restarts the server if a database file is modified, unless that database was opened in immutable mode with -i . ( #494 )
New ?_searchmode=raw option turns off escaping for FTS queries in ?_search= allowing full use of SQLite's FTS5 query syntax . ( #676 )",8,
342,0.37.1 (2020-03-02),"Don't attempt to count table rows to display on the index page for databases > 100MB. ( #688 )
Print exceptions if they occur in the write thread rather than silently swallowing them.
Handle the possibility of scope[""path""] being a string rather than bytes
Better documentation for the extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook.",8,
341,0.38 (2020-03-08),"The Docker build of Datasette now uses SQLite 3.31.1, upgraded from 3.26. ( #695 )
datasette publish cloudrun now accepts an optional --memory=2Gi flag for setting the Cloud Run allocated memory to a value other than the default (256Mi). ( #694 )
Fixed bug where templates that shipped with plugins were sometimes not being correctly loaded. ( #697 )",8,
340,0.39 (2020-03-24),"New base_url configuration setting for serving up the correct links while running Datasette under a different URL prefix. ( #394 )
New metadata settings ""sort"" and ""sort_desc"" for setting the default sort order for a table. See Setting a default sort order . ( #702 )
Sort direction arrow now displays by default on the primary key. This means you only have to click once (not twice) to sort in reverse order. ( #677 )
New await Request(scope, receive).post_vars() method for accessing POST form variables. ( #700 )
Plugin hooks documentation now links to example uses of each plugin. ( #709 )",8,
339,0.40 (2020-04-21),"Datasette Metadata can now be provided as a YAML file as an optional alternative to JSON. See Using YAML for metadata . ( #713 )
Removed support for datasette publish now , which used the now-retired Zeit Now v1 hosting platform. A new plugin, datasette-publish-now , can be installed to publish data to Zeit ( now Vercel ) Now v2. ( #710 )
Fixed a bug where the extra_template_vars(request, view_name) plugin hook was not receiving the correct view_name . ( #716 )
Variables added to the template context by the extra_template_vars() plugin hook are now shown in the ?_context=1 debugging mode (see template_debug ). ( #693 )
Fixed a bug where the ""templates considered"" HTML comment was no longer being displayed. ( #689 )
Fixed a datasette publish bug where --plugin-secret would over-ride plugin configuration in the provided metadata.json file. ( #724 )
Added a new CSS class for customizing the canned query page. ( #727 )",8,
338,0.41 (2020-05-06),"You can now create custom pages within your Datasette instance using a custom template file. For example, adding a template file called templates/pages/about.html will result in a new page being served at /about on your instance. See the custom pages documentation for full details, including how to return custom HTTP headers, redirects and status codes. ( #648 )
Configuration directory mode ( #731 ) allows you to define a custom Datasette instance as a directory. So instead of running the following:
$ datasette one.db two.db \
--metadata=metadata.json \
--template-dir=templates/ \
--plugins-dir=plugins \
--static css:css
You can instead arrange your files in a single directory called my-project and run this:
$ datasette my-project/
Also in this release:
New NOT LIKE table filter: ?colname__notlike=expression . ( #750 )
Datasette now has a pattern portfolio at /-/patterns - e.g. https://latest.datasette.io/-/patterns . This is a page that shows every Datasette user interface component in one place, to aid core development and people building custom CSS themes. ( #151 )
SQLite PRAGMA functions such as pragma_table_info(tablename) are now allowed in Datasette SQL queries. ( #761 )
Datasette pages now consistently return a content-type of text/html; charset=utf-8 . ( #752 )
Datasette now handles an ASGI raw_path value of None , which should allow compatibility with the Mangum adapter for running ASGI apps on AWS Lambda. Thanks, Colin Dellow. ( #719 )
Installation documentation now covers Using pipx . ( #756 )
Improved the documentation for Full-text search . ( #748 )",8,
337,0.42 (2020-05-08),"A small release which provides improved internal methods for use in plugins, along with documentation. See #685 .
Added documentation for db.execute() , see await db.execute(sql, ...) .
Renamed db.execute_against_connection_in_thread() to db.execute_fn() and made it a documented method, see await db.execute_fn(fn) .
New results.first() and results.single_value() methods, plus documentation for the Results class - see Results .",8,
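The convenience those two helpers add can be seen next to plain sqlite3, where the same two patterns take slightly more code (a stdlib sketch, not Datasette's Results API; the table is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
conn.execute("INSERT INTO t VALUES (1, 'first'), (2, 'second')")

# What results.first() gives you: the first row, or None if empty
first = conn.execute("SELECT * FROM t ORDER BY id").fetchone()

# What results.single_value() gives you: the one value from a
# one-row, one-column result
count = conn.execute("SELECT count(*) FROM t").fetchone()[0]
print(first, count)
```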
336,0.43 (2020-05-28),"The main focus of this release is a major upgrade to the register_output_renderer(datasette) plugin hook, which allows plugins to provide new output formats for Datasette such as datasette-atom and datasette-ics .
Redesign of register_output_renderer(datasette) to provide more context to the render callback and support an optional ""can_render"" callback that controls if a suggested link to the output format is provided. ( #581 , #770 )
Visually distinguish float and integer columns - useful for figuring out why order-by-column might be returning unexpected results. ( #729 )
The Request object , which is passed to several plugin hooks, is now documented. ( #706 )
New metadata.json option for setting a custom default page size for specific tables and views, see Setting a custom page size . ( #751 )
Canned queries can now be configured with a default URL fragment hash, useful when working with plugins such as datasette-vega , see Additional canned query options . ( #706 )
Fixed a bug in datasette publish when running on operating systems where the /tmp directory lives in a different volume, using a backport of the Python 3.8 shutil.copytree() function. ( #744 )
Every plugin hook is now covered by the unit tests, and a new unit test checks that each plugin hook has at least one corresponding test. ( #771 , #773 )",8,
335,The road to Datasette 1.0,"I've assembled a milestone for Datasette 1.0 . The focus of the 1.0 release will be the following:
Signify confidence in the quality/stability of Datasette
Give plugin authors confidence that their plugins will work for the whole 1.x release cycle
Provide the same confidence to developers building against Datasette JSON APIs
If you have thoughts about what you would like to see for Datasette 1.0 you can join the conversation on issue #519 .",8,
334,Smaller changes,"New internals documentation for Request object and Response class . ( #706 )
request.url now respects the force_https_urls config setting. ( #781 )
request.args.getlist() returns [] if missing. Removed request.raw_args entirely. ( #774 )
New datasette.get_database() method.
Added _ prefix to many private, undocumented methods of the Datasette class. ( #576 )
Removed the db.get_outbound_foreign_keys() method which duplicated the behaviour of db.foreign_keys_for_table() .
New await datasette.permission_allowed() method.
/-/actor debugging endpoint for viewing the currently authenticated actor.
New request.cookies property.
/-/plugins endpoint now shows a list of hooks implemented by each plugin, e.g. https://latest.datasette.io/-/plugins?all=1
request.post_vars() method no longer discards empty values.
New ""params"" canned query key for explicitly setting named parameters, see Canned query parameters . ( #797 )
request.args is now a MultiParams object.
Fixed a bug with the datasette plugins command. ( #802 )
Nicer pattern for using make_app_client() in tests. ( #395 )
New request.actor property.
Fixed broken CSS on nested 404 pages. ( #777 )
New request.url_vars property. ( #822 )
Fixed a bug with the python tests/fixtures.py command for outputting Datasette's testing fixtures database and plugins. ( #804 )
datasette publish heroku now deploys using Python 3.8.3.
Added a warning that the register_facet_classes() hook is unstable and may change in the future. ( #830 )
The {""$env"": ""ENVIRONMENT_VARIABLE""} mechanism (see Secret configuration values ) now works with variables inside nested lists. ( #837 )",8,
333,register_routes() plugin hooks,"Plugins can now register new views and routes via the register_routes(datasette) plugin hook ( #819 ). View functions can be defined that accept any of the current datasette object, the current request , or the ASGI scope , send and receive objects.",8,
332,Cookie methods,"Plugins can now use the new response.set_cookie() method to set cookies.
A new request.cookies property on the Request object can be used to read incoming cookies.",8,
331,CSRF protection,"Since writable canned queries are built using POST forms, Datasette now ships with CSRF protection ( #798 ). This applies automatically to any POST request, which means plugins need to include a csrftoken in any POST forms that they render. They can do that like so:
<input type=""hidden"" name=""csrftoken"" value=""{{ csrftoken() }}"">",8,
330,Signed values and secrets,"Both flash messages and user authentication needed a way to sign values and set signed cookies. Two new methods are now available for plugins to take advantage of this mechanism: .sign(value, namespace=""default"") and .unsign(value, namespace=""default"") .
Datasette will generate a secret automatically when it starts up, but to avoid resetting the secret (and hence invalidating any cookies) every time the server restarts you should set your own secret. You can pass a secret to Datasette using the new --secret option or with a DATASETTE_SECRET environment variable. See Configuring the secret for more details.
You can also set a secret when you deploy Datasette using datasette publish or datasette package - see Using secrets with datasette publish .
Plugins can now sign values and verify their signatures using the datasette.sign() and datasette.unsign() methods.",8,
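The sign/unsign mechanism, including the namespace that stops a signature produced in one context being replayed in another, can be illustrated with a stdlib HMAC sketch - this is not Datasette's implementation (which uses a signing library), just the underlying idea:

```python
import hashlib
import hmac

SECRET = b"dev-only-secret"  # in Datasette, set via --secret or DATASETTE_SECRET

def sign(value, namespace="default"):
    # The namespace is mixed into the signed payload
    sig = hmac.new(SECRET, f"{namespace}:{value}".encode(), hashlib.sha256)
    return f"{value}.{sig.hexdigest()}"

def unsign(signed, namespace="default"):
    value, _, sig = signed.rpartition(".")
    expected = hmac.new(SECRET, f"{namespace}:{value}".encode(), hashlib.sha256)
    if not hmac.compare_digest(sig, expected.hexdigest()):
        raise ValueError("Signature does not match")
    return value

token = sign("actor=root")
print(token)
print(unsign(token))
```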
329,Flash messages,"Writable canned queries needed a mechanism to let the user know that the query has been successfully executed. The new flash messaging system ( #790 ) allows messages to persist in signed cookies which are then displayed to the user on the next page that they visit. Plugins can use this mechanism to display their own messages, see .add_message(request, message, type=datasette.INFO) for details.
You can try out the new messages using the /-/messages debug tool, for example at https://latest.datasette.io/-/messages",8,
328,Writable canned queries,"Datasette's Canned queries feature lets you define SQL queries in metadata.json which can then be executed by users visiting a specific URL, for example https://latest.datasette.io/fixtures/neighborhood_search .
Canned queries were previously restricted to SELECT , but Datasette 0.44 introduces the ability for canned queries to execute INSERT or UPDATE queries as well, using the new ""write"": true property ( #800 ):
{
""databases"": {
""dogs"": {
""queries"": {
""add_name"": {
""sql"": ""INSERT INTO names (name) VALUES (:name)"",
""write"": true
}
}
}
}
}
See Writable canned queries for more details.",8,
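The SQL from that canned query can be exercised directly with sqlite3 - Datasette executes it the same way, binding the POSTed form fields to the named :name parameters (the table schema below is assumed for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE names (id INTEGER PRIMARY KEY, name TEXT)")

# The canned query's SQL, with the :name parameter bound from form data
conn.execute("INSERT INTO names (name) VALUES (:name)", {"name": "Cleo"})

rows = conn.execute("SELECT name FROM names").fetchall()
print(rows)
```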
327,Permissions,"Datasette also now has a built-in concept of Permissions . The permissions system answers the following question:
Is this actor allowed to perform this action , optionally against this particular resource ?
You can use the new ""allow"" block syntax in metadata.json (or metadata.yaml ) to set required permissions at the instance, database, table or canned query level. For example, to restrict access to the fixtures.db database to the ""root"" user:
{
""databases"": {
""fixtures"": {
""allow"": {
""id"": ""root""
}
}
}
}
See Defining permissions with ""allow"" blocks for more details.
Plugins can implement their own custom permission checks using the new permission_allowed(datasette, actor, action, resource) hook.
A new debug page at /-/permissions shows recent permission checks, to help administrators and plugin authors understand exactly what checks are being performed. This tool defaults to only being available to the root user, but can be exposed to other users by plugins that respond to the permissions-debug permission. ( #788 )",8,
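The core matching rule for ""allow"" blocks can be sketched in a few lines - the real implementation supports more shapes, but the essential check is that every key in the block must match the actor, where the block value may be a single value or a list of acceptable values (this function is an illustration, not Datasette's code):

```python
def actor_matches_allow(actor, allow):
    # An unauthenticated request has no actor and is denied
    if actor is None:
        return False
    for key, values in allow.items():
        if not isinstance(values, list):
            values = [values]
        if actor.get(key) not in values:
            return False
    return True

allow = {"id": "root"}
print(actor_matches_allow({"id": "root"}, allow))   # matches
print(actor_matches_allow({"id": "simon"}, allow))  # does not match
```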
326,Authentication,"Prior to this release the Datasette ecosystem has treated authentication as exclusively the realm of plugins, most notably through datasette-auth-github .
0.44 introduces Authentication and permissions as core Datasette concepts ( #699 ). This enables different plugins to share responsibility for authenticating requests - you might have one plugin that handles user accounts and another one that allows automated access via API keys, for example.
You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new --root command-line option, which outputs a one-time use URL to authenticate as a root actor ( #784 ):
$ datasette fixtures.db --root
http://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477
INFO: Started server process [14973]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
Plugins can implement new ways of authenticating users using the new actor_from_request(datasette, request) hook.",8,
325,0.44 (2020-06-11),"See also Datasette 0.44: The annotated release notes .
Authentication and permissions, writable canned queries, flash messages, new plugin hooks and more.",8,
324,Smaller changes,"Cascading view permissions - so if a user has view-table they can view the table page even if they do not have view-database or view-instance . ( #832 )
CSRF protection no longer applies to Authentication: Bearer token requests or requests without cookies. ( #835 )
datasette.add_message() now works inside plugins. ( #864 )
Workaround for ""Too many open files"" error in test runs. ( #846 )
Respect existing scope[""actor""] if already set by ASGI middleware. ( #854 )
New process for shipping Alpha and beta releases . ( #807 )
{{ csrftoken() }} now works when plugins render a template using datasette.render_template(..., request=request) . ( #863 )
Datasette now creates a single Request object and uses it throughout the lifetime of the current HTTP request. ( #870 )",8,
323,New plugin hooks,"register_magic_parameters(datasette) can be used to define new types of magic canned query parameters.
startup(datasette) can run custom code when Datasette first starts up. datasette-init is a new plugin that uses this hook to create database tables and views on startup if they have not yet been created. ( #834 )
canned_queries(datasette, database, actor) lets plugins provide additional canned queries beyond those defined in Datasette's metadata. See datasette-saved-queries for an example of this hook in action. ( #852 )
forbidden(datasette, request, message) is a hook for customizing how Datasette responds to 403 forbidden errors. ( #812 )",8,
322,Better plugin documentation,"The plugin documentation has been re-arranged into four sections, including a brand new section on testing plugins. ( #687 )
Plugins introduces Datasette's plugin system and describes how to install and configure plugins.
Writing plugins describes how to author plugins, from one-off single file plugins to packaged plugins that can be published to PyPI. It also describes how to start a plugin using the new datasette-plugin cookiecutter template.
Plugin hooks is a full list of detailed documentation for every Datasette plugin hook.
Testing plugins describes how to write tests for Datasette plugins, using pytest and HTTPX .",8,
321,Log out,"The ds_actor cookie can be used by plugins (or by Datasette's --root mechanism ) to authenticate users. The new /-/logout page provides a way to clear that cookie.
A ""Log out"" button now shows in the global navigation provided the user is authenticated using the ds_actor cookie. ( #840 )",8,
320,Magic parameters for canned queries,"Canned queries now support Magic parameters , which can be used to insert or select automatically generated values. For example:
insert into logs
(user_id, timestamp)
values
(:_actor_id, :_now_datetime_utc)
This inserts the currently authenticated actor ID and the current datetime. ( #842 )",8,
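A query like this would typically be defined as a writable canned query in metadata.json, along these lines (a sketch: mydatabase and add_log are hypothetical names, and "write": true marks the query as writable):

```json
{
    "databases": {
        "mydatabase": {
            "queries": {
                "add_log": {
                    "sql": "insert into logs (user_id, timestamp) values (:_actor_id, :_now_datetime_utc)",
                    "write": true
                }
            }
        }
    }
}
```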
319,0.45 (2020-07-01),"See also Datasette 0.45: The annotated release notes .
Magic parameters for canned queries, a log out feature, improved plugin documentation and four new plugin hooks.",8,
318,0.46 (2020-08-09),"This release contains a security fix related to authenticated writable canned queries. If you are using this feature you should upgrade as soon as possible.
Security fix: CSRF tokens were incorrectly included in read-only canned query forms, which could allow them to be leaked to a sophisticated attacker. See issue 918 for details.
Datasette now supports GraphQL via the new datasette-graphql plugin - see GraphQL in Datasette with the new datasette-graphql plugin .
The principal Git branch has been renamed from master to main . ( #849 )
New debugging tool: /-/allow-debug ( demo here ) helps test allow blocks against actors, as described in Defining permissions with ""allow"" blocks . ( #908 )
New logo for the documentation, and a new project tagline: ""An open source multi-tool for exploring and publishing data"".
Whitespace in column values is now respected on display, using white-space: pre-wrap . ( #896 )
New await request.post_body() method for accessing the raw POST body, see Request object . ( #897 )
Database file downloads now include a content-length HTTP header, enabling download progress bars. ( #905 )
File downloads now also correctly set the suggested file name using a content-disposition HTTP header. ( #909 )
Tests are now excluded from the Datasette package properly - thanks, abeyerpath. ( #456 )
The Datasette package published to PyPI now includes sdist as well as bdist_wheel .
Better titles for canned query pages. ( #887 )
Now only loads Python files from a directory passed using the --plugins-dir option - thanks, Amjith Ramanujam. ( #890 )
New documentation section on Publishing to Vercel .",8,
317,0.47 (2020-08-11),"Datasette now has a GitHub discussions forum for conversations about the project that go beyond just bug reports and issues.
Datasette can now be installed on macOS using Homebrew! Run brew install simonw/datasette/datasette . See Using Homebrew . ( #335 )
Two new commands: datasette install name-of-plugin and datasette uninstall name-of-plugin . These are equivalent to pip install and pip uninstall but automatically run in the same virtual environment as Datasette, so users don't have to figure out where that virtual environment is - useful for installations created using Homebrew or pipx . See Installing plugins . ( #925 )
A new command-line option, datasette --get , accepts a path to a URL within the Datasette instance. It will run that request through Datasette (without starting a web server) and print out the response. See datasette --get for an example. ( #926 )",8,
316,0.47.1 (2020-08-11),Fixed a bug where the sdist distribution of Datasette was not correctly including the template files. ( #930 ),8,
315,0.47.2 (2020-08-12),Fixed an issue with the Docker image published to Docker Hub . ( #931 ),8,
314,0.47.3 (2020-08-15),The datasette --get command-line mechanism now ensures any plugins using the startup() hook are correctly executed. ( #934 ),8,
313,0.48 (2020-08-16),"Datasette documentation now lives at docs.datasette.io .
db.is_mutable property is now documented and tested, see Database introspection .
The extra_template_vars , extra_css_urls , extra_js_urls and extra_body_script plugin hooks now all accept the same arguments. See extra_template_vars(template, database, table, columns, view_name, request, datasette) for details. ( #939 )
Those hooks now accept a new columns argument detailing the table columns that will be rendered on that page. ( #938 )
Fixed bug where plugins calling db.execute_write_fn() could hang Datasette if the connection failed. ( #935 )
Fixed bug with the ?_nl=on output option and binary data. ( #914 )",8,
312,0.49 (2020-09-14),"See also Datasette 0.49: The annotated release notes .
Writable canned queries now expose a JSON API, see JSON API for writable canned queries . ( #880 )
New mechanism for defining page templates with custom path parameters - a template file called pages/about/{slug}.html will be used to render any requests to /about/something . See Path parameters for pages . ( #944 )
register_output_renderer() render functions can now return a Response . ( #953 )
New --upgrade option for datasette install . ( #945 )
New datasette --pdb option. ( #962 )
datasette --get exit code now reflects the internal HTTP status code. ( #947 )
New raise_404() template function for returning 404 errors. ( #964 )
datasette publish heroku now deploys using Python 3.8.5.
Upgraded CodeMirror to 5.57.0. ( #948 )
Upgraded code style to Black 20.8b1. ( #958 )
Fixed bug where selected facets were not correctly persisted in hidden form fields on the table page. ( #963 )
Renamed the default error template from 500.html to error.html .
Custom error pages are now documented, see Custom error pages . ( #965 )",8,
311,0.49.1 (2020-09-15),Fixed a bug with writable canned queries that use magic parameters but accept no non-magic arguments. ( #967 ),8,
310,0.50 (2020-10-09),"The key new feature in this release is the column actions menu on the table page ( #891 ). This can be used to sort a column in ascending or descending order, facet data by that column or filter the table to just rows that have a value for that column.
Plugin authors can use the new datasette.client object to make internal HTTP requests from their plugins, allowing them to make use of Datasette's JSON API. ( #943 )
New Deploying Datasette documentation with guides for deploying Datasette on a Linux server using systemd or to hosting providers that support buildpacks . ( #514 , #997 )
Other improvements in this release:
Publishing to Google Cloud Run documentation now covers Google Cloud SDK options. Thanks, Geoffrey Hing. ( #995 )
New datasette -o option which opens your browser as soon as Datasette starts up. ( #970 )
Datasette now sets sqlite3.enable_callback_tracebacks(True) so that errors in custom SQL functions will display tracebacks. ( #891 )
Fixed two rendering bugs with column headers in portrait mobile view. ( #978 , #980 )
New db.table_column_details(table) introspection method for retrieving full details of the columns in a specific table, see Database introspection .
Fixed a routing bug with custom page wildcard templates. ( #996 )
datasette publish heroku now deploys using Python 3.8.6.
New datasette publish heroku --tar= option. ( #969 )
OPTIONS requests against HTML pages no longer return a 500 error. ( #1001 )
Datasette now supports Python 3.9.
See also Datasette 0.50: The annotated release notes .",8,
309,0.50.1 (2020-10-09),"Fixed a bug introduced in 0.50 where the export as JSON/CSV links on the table, row and query pages were broken. ( #1010 )",8,
308,0.50.2 (2020-10-09),Fixed another bug introduced in 0.50 where column header links on the table page were broken. ( #1011 ),8,
307,Smaller changes,"Wide tables shown within Datasette now scroll horizontally ( #998 ). This is achieved using a new <div class=""table-wrapper""> element which may impact the implementation of some plugins (for example this change to datasette-cluster-map ).
New debug-menu permission. ( #1068 )
Removed --debug option, which didn't do anything. ( #814 )
Link: HTTP header pagination. ( #1014 )
x button for clearing filters. ( #1016 )
Edit SQL button on canned queries. ( #1019 )
--load-extension=spatialite shortcut. ( #1028 )
scale-in animation for column action menu. ( #1039 )
Option to pass a list of templates to .render_template() is now documented. ( #1045 )
New datasette.urls.static_plugins() method. ( #1033 )
datasette -o option now opens the most relevant page. ( #976 )
datasette --cors option now enables access to /database.db downloads. ( #1057 )
Database file downloads now implement cascading permissions, so you can download a database if you have view-database-download permission even if you do not have permission to access the Datasette instance. ( #1058 )
New documentation on Designing URLs for your plugin . ( #1053 )",8,
306,Running Datasette behind a proxy,"The base_url configuration option is designed to help run Datasette on a specific path behind a proxy - for example if you want to run an instance of Datasette at /my-datasette/ within your existing site's URL hierarchy, proxied behind nginx or Apache.
Support for this configuration option has been greatly improved ( #1023 ), and guidelines for using it are now available in a new documentation section on Running Datasette behind a proxy . ( #1027 )",8,
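For the /my-datasette/ example, the nginx side might be sketched like this (a minimal configuration sketch, not a production setup; it assumes Datasette is listening on 127.0.0.1:8001 with its base_url setting also set to /my-datasette/):

```nginx
daemon off;
events {
    worker_connections 1024;
}
http {
    server {
        listen 80;
        location /my-datasette/ {
            # Forward requests, preserving the /my-datasette/ prefix so
            # that base_url can generate matching links
            proxy_pass http://127.0.0.1:8001/my-datasette/;
            proxy_set_header Host $host;
        }
    }
}
```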
305,URL building,"The new datasette.urls family of methods can be used to generate URLs to key pages within the Datasette interface, both within custom templates and Datasette plugins. See Building URLs within plugins for more details. ( #904 )",8,
304,Binary data,"SQLite tables can contain binary data in BLOB columns. Datasette now provides links for users to download this data directly from Datasette, and uses those links to make binary data available from CSV exports. See Binary data for more details. ( #1036 and #1034 ).",8,
303,Plugins can now add links within Datasette,"A number of existing Datasette plugins add new pages to the Datasette interface, providing tools for things like uploading CSVs , editing table schemas or configuring full-text search .
Plugins like this can now link to themselves from other parts of Datasette interface. The menu_links(datasette, actor, request) hook ( #1064 ) lets plugins add links to Datasette's new top-right application menu, and the table_actions(datasette, actor, database, table, request) hook ( #1066 ) adds links to a new ""table actions"" menu on the table page.
The demo at latest.datasette.io now includes some example plugins. To see the new table actions menu first sign into that demo as root and then visit the facetable table to see the new cog icon menu at the top of the page.",8,
302,New visual design,"Datasette is no longer white and grey with blue and purple links! Natalie Downe has been working on a visual refresh, the first iteration of which is included in this release. ( #1056 )",8,
301,0.51 (2020-10-31),"A new visual design, plugin hooks for adding navigation options, better handling of binary data, URL building utility methods and better support for running Datasette behind a proxy.",8,
300,0.51.1 (2020-10-31),Improvements to the new Binary data documentation page.,8,
299,0.52 (2020-11-28),"This release includes a number of changes relating to an internal rebranding effort: Datasette's configuration mechanism (things like datasette --config default_page_size:10 ) has been renamed to settings .
New --setting default_page_size 10 option as a replacement for --config default_page_size:10 (note the lack of a colon). The --config option is deprecated but will continue working until Datasette 1.0. ( #992 )
The /-/config introspection page is now /-/settings , and the previous page redirects to the new one. ( #1103 )
The config.json file in Configuration directory mode is now called settings.json . ( #1104 )
The undocumented datasette.config() internal method has been replaced by a documented .setting(key) method. ( #1107 )
Also in this release:
New plugin hook: database_actions(datasette, actor, database, request) , which adds menu items to a new cog menu shown at the top of the database page. ( #1077 )
datasette publish cloudrun has a new --apt-get-install option that can be used to install additional Ubuntu packages as part of the deployment. This is useful for deploying the new datasette-ripgrep plugin . ( #1110 )
Swept the documentation to remove words that minimize involved difficulty. ( #1089 )
And some bug fixes:
Foreign keys linking to rows with blank label columns now display as a hyphen, allowing those links to be clicked. ( #1086 )
Fixed bug where row pages could sometimes 500 if the underlying queries exceeded a time limit. ( #1088 )
Fixed a bug where the table action menu could appear partially obscured by the edge of the page. ( #1084 )",8,
298,0.52.1 (2020-11-29),"Documentation on Testing plugins now recommends using datasette.client . ( #1102 )
Fix bug where compound foreign keys produced broken links. ( #1098 )
datasette --load-extension=spatialite now also checks for /usr/local/lib/mod_spatialite.so . Thanks, Dan Peterson. ( #1114 )
297,0.52.2 (2020-12-02),"Generated columns from SQLite 3.31.0 or higher are now correctly displayed. ( #1116 )
Error message if you attempt to open a SpatiaLite database now suggests using --load-extension=spatialite if it detects that the extension is available in a common location. ( #1115 )
OPTIONS requests against the /database page no longer raise a 500 error. ( #1100 )
Databases larger than 32MB that are published to Cloud Run can now be downloaded. ( #749 )
Fix for misaligned cog icon on table and database pages. Thanks, Abdussamet Koçak. ( #1121 )",8,
296,0.52.3 (2020-12-03),Fixed bug where static assets would 404 for Datasette installed on ARM Amazon Linux. ( #1124 ),8,
295,0.52.4 (2020-12-05),"Show pysqlite3 version on /-/versions , if installed. ( #1125 )
Errors output by Datasette (e.g. for invalid SQL queries) now go to stderr , not stdout . ( #1131 )
Fix for a startup error on windows caused by unnecessary from os import EX_CANTCREAT - thanks, Abdussamet Koçak. ( #1094 )",8,
294,0.52.5 (2020-12-09),Fix for error caused by combining the _searchmode=raw and ?_search_COLUMN parameters. ( #1134 ),8,
293,0.53 (2020-12-10),"Datasette has an official project website now, at https://datasette.io/ . This release mainly updates the documentation to reflect the new site.
New ?column__arraynotcontains= table filter. ( #1132 )
datasette serve has a new --create option, which will create blank database files if they do not already exist rather than exiting with an error. ( #1135 )
New ?_header=off option for CSV export which omits the CSV header row, documented here . ( #1133 )
""Powered by Datasette"" link in the footer now links to https://datasette.io/ . ( #1138 )
Project news no longer lives in the README - it can now be found at https://datasette.io/news . ( #1137 )",8,
292,Other changes,"Datasette can now open multiple database files with the same name, e.g. if you run datasette path/to/one.db path/to/other/one.db . ( #509 )
datasette publish cloudrun now sets force_https_urls for every deployment, fixing some incorrect http:// links. ( #1178 )
Fixed a bug in the example nginx configuration in Running Datasette behind a proxy . ( #1091 )
The Datasette Ecosystem documentation page has been reduced in size in favour of the datasette.io tools and plugins directories. ( #1182 )
The request object now provides a request.full_path property, which returns the path including any query string. ( #1184 )
Better error message for disallowed PRAGMA clauses in SQL queries. ( #1185 )
datasette publish heroku now deploys using python-3.8.7 .
New plugin testing documentation on Testing outbound HTTP calls with pytest-httpx . ( #1198 )
All ?_* query string parameters passed to the table page are now persisted in hidden form fields, so parameters such as ?_size=10 will be correctly passed to the next page when query filters are changed. ( #1194 )
Fixed a bug loading a database file called test-database (1).sqlite . ( #1181 )",8,
291,Code formatting with Black and Prettier,"Datasette adopted Black for opinionated Python code formatting in June 2019. Datasette now also embraces Prettier for JavaScript formatting, which like Black is enforced by tests in continuous integration. Instructions for using these two tools can be found in the new section on Code formatting in the contributors documentation. ( #1167 )",8,
290,JavaScript modules,"JavaScript modules were introduced in ECMAScript 2015 and provide native browser support for the import and export keywords.
To use modules, JavaScript needs to be included in <script> tags with a type=""module"" attribute.
You can also specify a SRI (subresource integrity hash) for these assets:
{
""extra_css_urls"": [
{
""url"": ""https://simonwillison.net/static/css/all.bf8cd891642c.css"",
""sri"": ""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI""
}
],
""extra_js_urls"": [
{
""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"",
""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=""
}
]
}
This will produce:
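Approximately the following markup (a sketch; exact attribute ordering and formatting may vary by Datasette version):

```html
<link rel="stylesheet"
  href="https://simonwillison.net/static/css/all.bf8cd891642c.css"
  integrity="sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"
  crossorigin="anonymous">
<script
  src="https://code.jquery.com/jquery-3.2.1.slim.min.js"
  integrity="sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="
  crossorigin="anonymous"></script>
```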
Modern browsers will only execute the stylesheet or JavaScript if the SRI hash
matches the content served. You can generate hashes using www.srihash.org
Items in ""extra_js_urls"" can specify ""module"": true if they reference JavaScript that uses JavaScript modules . This configuration:
{
""extra_js_urls"": [
{
""url"": ""https://example.datasette.io/module.js"",
""module"": true
}
]
}
Will produce this HTML:
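Approximately (a sketch; the key detail is the type="module" attribute):

```html
<script type="module" src="https://example.datasette.io/module.js"></script>
```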
",8,
200,Custom pages and templates,Datasette provides a number of ways of customizing the way data is displayed.,8,
199,Upgrading CodeMirror,"Datasette bundles CodeMirror for the SQL editing interface, e.g. on this page . Here are the steps for upgrading to a new version of CodeMirror:
Download and extract latest CodeMirror zip file from https://codemirror.net/codemirror.zip
Rename lib/codemirror.js to codemirror-5.57.0.js (using latest version number)
Rename lib/codemirror.css to codemirror-5.57.0.css
Rename mode/sql/sql.js to codemirror-5.57.0-sql.js
Edit both JavaScript files to make the top license comment a /* */ block instead of multiple // lines
Minify the JavaScript files like this:
npx uglify-js codemirror-5.57.0.js -o codemirror-5.57.0.min.js --comments '/LICENSE/'
npx uglify-js codemirror-5.57.0-sql.js -o codemirror-5.57.0-sql.min.js --comments '/LICENSE/'
Check that the LICENSE comment did indeed survive minification
Minify the CSS file like this:
npx clean-css-cli codemirror-5.57.0.css -o codemirror-5.57.0.min.css
Edit the _codemirror.html template to reference the new files
git rm the old files, git add the new files",8,
198,Releasing bug fixes from a branch,"If it's necessary to publish a bug fix release without shipping new features that have landed on main , a release branch can be used.
Create it from the relevant last tagged release like so:
git branch 0.52.x 0.52.4
git checkout 0.52.x
Next cherry-pick the commits containing the bug fixes:
git cherry-pick COMMIT
Write the release notes in the branch, and update the version number in version.py . Then push the branch:
git push -u origin 0.52.x
Once the tests have completed, publish the release from that branch by selecting it as the target in the GitHub Draft a new release form.
Finally, cherry-pick the commit with the release notes and version number bump across to main :
git checkout main
git cherry-pick COMMIT
git push",8,
197,Alpha and beta releases,"Alpha and beta releases are published to preview upcoming features that may not yet be stable - in particular to preview new plugin hooks.
You are welcome to try these out, but please be aware that details may change before the final release.
Please join discussions on the issue tracker to share your thoughts and experiences with the alpha and beta features that you try out.
196,Release process,"Datasette releases are performed using tags. When a new release is published on GitHub, a GitHub Action workflow will perform the following:
Run the unit tests against all supported Python versions. If the tests pass...
Build a Docker image of the release and push a tag to https://hub.docker.com/r/datasetteproject/datasette
Re-point the ""latest"" tag on Docker Hub to the new image
Build a wheel bundle of the underlying Python source code
Push that new wheel up to PyPI: https://pypi.org/project/datasette/
To deploy new releases you will need to have push access to the main Datasette GitHub repository.
Datasette follows Semantic Versioning :
major.minor.patch
We increment major for backwards-incompatible releases. Datasette is currently pre-1.0 so the major version is always 0 .
We increment minor for new features.
We increment patch for bugfix releases.
Alpha and beta releases may have an additional a0 or b0 suffix - the integer component will be incremented with each subsequent alpha or beta.
To release a new version, first create a commit that updates the version number in datasette/version.py and the changelog with highlights of the new version. An example commit can be seen here :
# Update changelog
git commit -m "" Release 0.51a1
Refs #1056, #1039, #998, #1045, #1033, #1036, #1034, #976, #1057, #1058, #1053, #1064, #1066"" -a
git push
Referencing the issues that are part of the release in the commit message ensures the name of the release shows up on those issue pages, e.g. here .
You can generate the list of issue references for a specific release by copying and pasting text from the release notes or GitHub changes-since-last-release view into this Extract issue numbers from pasted text tool.
To create the tag for the release, create a new release on GitHub matching the new version number. You can convert the release notes to Markdown by copying and pasting the rendered HTML into this Paste to Markdown tool .
Finally, post a news item about the release on datasette.io by editing the news.yaml file in that site's repository.",8,
195,Continuously deployed demo instances,"The demo instance at latest.datasette.io is re-deployed automatically to Google Cloud Run for every push to main that passes the test suite. This is implemented by the GitHub Actions workflow at .github/workflows/deploy-latest.yml .
Specific branches can also be set to automatically deploy by adding them to the on: push: branches block at the top of the workflow YAML file. Branches configured in this way will be deployed to a new Cloud Run service whether or not their tests pass.
The Cloud Run URL for a branch demo can be found in the GitHub Actions logs.",8,
194,Running Cog,"Some pages of documentation (in particular the CLI reference ) are automatically updated using Cog .
To update these pages, run the following command:
cog -r docs/*.rst",8,
193,Editing and building the documentation,"Datasette's documentation lives in the docs/ directory and is deployed automatically using Read The Docs .
The documentation is written using reStructuredText. You may find this article on The subset of reStructuredText worth committing to memory useful.
You can build it locally by installing sphinx and sphinx_rtd_theme in your Datasette development environment and then running make html directly in the docs/ directory:
# You may first need to activate your virtual environment:
source venv/bin/activate
# Install the dependencies needed to build the docs
pip install -e .[docs]
# Now build the docs
cd docs/
make html
This will create the HTML version of the documentation in docs/_build/html . You can open it in your browser like so:
open _build/html/index.html
Any time you make changes to a .rst file you can re-run make html to update the built documents, then refresh them in your browser.
For added productivity, you can use sphinx-autobuild to run Sphinx in auto-build mode. This will run a local webserver serving the docs that automatically rebuilds them and refreshes the page any time you hit save in your editor.
sphinx-autobuild will have been installed when you ran pip install -e .[docs] . In your docs/ directory you can start the server by running the following:
make livehtml
Now browse to http://localhost:8000/ to view the documentation. Any edits you make should be instantly reflected in your browser.",8,
192,Prettier,"To install Prettier, install Node.js and then run the following in the root of your datasette repository checkout:
$ npm install
This will install Prettier in a node_modules directory. You can then check that your code matches the coding style like so:
$ npm run prettier -- --check
> prettier
> prettier 'datasette/static/*[!.min].js' ""--check""
Checking formatting...
[warn] datasette/static/plugins.js
[warn] Code style issues found in the above file(s). Forgot to run Prettier?
You can fix any problems by running:
$ npm run fix",8,
191,blacken-docs,"The blacken-docs command applies Black formatting rules to code examples in the documentation. Run it like this:
blacken-docs -l 60 docs/*.rst",8,
190,Running Black,"Black will be installed when you run pip install -e '.[test]' . To test that your code complies with Black, run the following in your root datasette repository checkout:
$ black . --check
All done! ✨ 🍰 ✨
95 files would be left unchanged.
If any of your code does not conform to Black you can run this to automatically fix those problems:
$ black .
reformatted ../datasette/setup.py
All done! ✨ 🍰 ✨
1 file reformatted, 94 files left unchanged.",8,
189,Code formatting,"Datasette uses opinionated code formatters: Black for Python and Prettier for JavaScript.
These formatters are enforced by Datasette's continuous integration: if a commit includes Python or JavaScript code that does not match the style enforced by those tools, the tests will fail.
When developing locally, you can verify and correct the formatting of your code using these tools.",8,
188,Debugging,"Any errors that occur while Datasette is running will display a stack trace on the console.
You can tell Datasette to open an interactive pdb debugger session if an error occurs using the --pdb option:
datasette --pdb fixtures.db",8,
187,Using fixtures,"To run Datasette itself, type datasette .
You're going to need at least one SQLite database. A quick way to get started is to use the fixtures database that Datasette uses for its own tests.
You can create a copy of that database by running this command:
python tests/fixtures.py fixtures.db
Now you can run Datasette against the new fixtures database like so:
datasette fixtures.db
This will start a server at http://127.0.0.1:8001/ .
Any changes you make in the datasette/templates or datasette/static folder will be picked up immediately (though you may need to do a force-refresh in your browser to see changes to CSS or JavaScript).
If you want to change Datasette's Python code you can use the --reload option to cause Datasette to automatically reload any time the underlying code changes:
datasette --reload fixtures.db
You can also use the fixtures.py script to recreate the testing version of metadata.json used by the unit tests. To do that:
python tests/fixtures.py fixtures.db fixtures-metadata.json
Or to output the plugins used by the tests, run this:
python tests/fixtures.py fixtures.db fixtures-metadata.json fixtures-plugins
Test tables written to fixtures.db
- metadata written to fixtures-metadata.json
Wrote plugin: fixtures-plugins/register_output_renderer.py
Wrote plugin: fixtures-plugins/view_name.py
Wrote plugin: fixtures-plugins/my_plugin.py
Wrote plugin: fixtures-plugins/messages_output_renderer.py
Wrote plugin: fixtures-plugins/my_plugin_2.py
Then run Datasette like this:
datasette fixtures.db -m fixtures-metadata.json --plugins-dir=fixtures-plugins/",8,
186,Running the tests,"Once you have done this, you can run the Datasette unit tests from inside your datasette/ directory using pytest like so:
pytest
You can run the tests faster using multiple CPU cores with pytest-xdist like this:
pytest -n auto -m ""not serial""
-n auto detects the number of available cores automatically. The -m ""not serial"" skips tests that don't work well in a parallel test environment. You can run those tests separately like so:
pytest -m ""serial""",8,
185,Setting up a development environment,"If you have Python 3.7 or higher installed on your computer (on OS X the quickest way to do this is using homebrew ) you can install an editable copy of Datasette using the following steps.
If you want to use GitHub to publish your changes, first create a fork of datasette under your own GitHub account.
Now clone that repository somewhere on your computer:
git clone git@github.com:YOURNAME/datasette
If you want to get started without creating your own fork, you can do this instead:
git clone git@github.com:simonw/datasette
The next step is to create a virtual environment for your project and use it to install Datasette's dependencies:
cd datasette
# Create a virtual environment in ./venv
python3 -m venv ./venv
# Now activate the virtual environment, so pip can install into it
source venv/bin/activate
# Install Datasette and its testing dependencies
python3 -m pip install -e '.[test]'
That last line does most of the work: pip install -e means ""install this package in a way that allows me to edit the source code in place"". The .[test] option means ""use the setup.py in this directory and install the optional testing dependencies as well"".",8,
184,General guidelines,"main should always be releasable . Incomplete features should live in branches. This ensures that any small bug fixes can be quickly released.
The ideal commit should bundle together the implementation, unit tests and associated documentation updates. The commit message should link to an associated issue.
New plugin hooks should only be shipped if accompanied by a separate release of a non-demo plugin that uses them.",8,
183,Contributing,"Datasette is an open source project. We welcome contributions!
This document describes how to contribute to Datasette core. You can also contribute to the wider Datasette ecosystem by creating new Plugins .",8,
182,Import shortcuts,"The following commonly used symbols can be imported directly from the datasette module:
from datasette import Response
from datasette import Forbidden
from datasette import NotFound
from datasette import hookimpl
from datasette import actor_matches_allow",8,
181,Tracing child tasks,"If your code uses a mechanism such as asyncio.gather() to execute code in additional tasks, you may find that some of the traces are missing from the display.
You can use the trace_child_tasks() context manager to ensure these child tasks are correctly handled.
from datasette import tracer
with tracer.trace_child_tasks():
    results = await asyncio.gather(
        # ... async tasks here
    )
This example uses the register_routes() plugin hook to add a page at /parallel-queries which executes two SQL queries in parallel using asyncio.gather() and returns their results.
import asyncio

from datasette import hookimpl
from datasette import tracer
from datasette import Response


@hookimpl
def register_routes():
    async def parallel_queries(datasette):
        db = datasette.get_database()
        with tracer.trace_child_tasks():
            one, two = await asyncio.gather(
                db.execute(""select 1""),
                db.execute(""select 2""),
            )
        return Response.json(
            {
                ""one"": one.single_value(),
                ""two"": two.single_value(),
            }
        )

    return [
        (r""/parallel-queries$"", parallel_queries),
    ]
Adding ?_trace=1 will show that the trace covers both of those child tasks.",8,
180,datasette.tracer,"Running Datasette with --setting trace_debug 1 enables trace debug output, which can then be viewed by adding ?_trace=1 to the query string for any page.
You can see an example of this at the bottom of latest.datasette.io/fixtures/facetable?_trace=1 . The JSON output shows full details of every SQL query that was executed to generate the page.
The datasette-pretty-traces plugin can be installed to provide a more readable display of this information. You can see a demo of that here .
You can add your own custom traces to the JSON output using the trace() context manager. This takes a string that identifies the type of trace being recorded, and records any keyword arguments as additional JSON keys on the resulting trace object.
The start and end time, duration and a traceback of where the trace was executed will be automatically attached to the JSON object.
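Conceptually, trace() behaves like a timing context manager that appends one JSON-serializable object per traced operation. The following is a simplified pure-Python sketch of that idea, not Datasette's actual implementation (the real trace() also records a traceback and stores traces per-request):

```python
import time
from contextlib import contextmanager

captured_traces = []  # stands in for Datasette's per-request trace list


@contextmanager
def trace(trace_type, **kwargs):
    start = time.perf_counter()
    try:
        yield
    finally:
        # The duration plus any keyword arguments become keys
        # on the resulting trace object
        captured_traces.append({
            'type': trace_type,
            'duration_ms': (time.perf_counter() - start) * 1000,
            **kwargs,
        })


with trace('fetch-url', url='https://example.com/'):
    time.sleep(0.01)

print(captured_traces[0]['type'])  # fetch-url
```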
This example uses trace to record the start, end and duration of any HTTP GET requests made using httpx:
from datasette.tracer import trace
import httpx


async def fetch_url(url):
    with trace(""fetch-url"", url=url):
        async with httpx.AsyncClient() as client:
            return await client.get(url)",8,
179,Tilde encoding,"Datasette uses a custom encoding scheme in some places, called tilde encoding . This is primarily used for table names and row primary keys, to avoid any confusion between / characters in those values and the Datasette URLs that reference them.
Tilde encoding uses the same algorithm as URL percent-encoding , but with the ~ tilde character used in place of % .
Any character other than ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789_- will be replaced by the numeric equivalent preceded by a tilde. For example:
/ becomes ~2F
. becomes ~2E
% becomes ~25
~ becomes ~7E
Space becomes +
polls/2022.primary becomes polls~2F2022~2Eprimary
Note that the space character is a special case: it will be replaced with a + symbol.
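The scheme can be illustrated with a short pure-Python sketch. This is an illustration of the algorithm only, not Datasette's source - in real code use datasette.utils.tilde_encode and datasette.utils.tilde_decode:

```python
def tilde_encode(s):
    # Characters that pass through unchanged
    safe = set(
        'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
        'abcdefghijklmnopqrstuvwxyz'
        '0123456789_-'
    )
    out = []
    for ch in s:
        if ch in safe:
            out.append(ch)
        elif ch == ' ':
            out.append('+')  # space is the special case
        else:
            # ~ plus two uppercase hex digits per UTF-8 byte
            out.extend('~{:02X}'.format(b) for b in ch.encode('utf-8'))
    return ''.join(out)


def tilde_decode(s):
    raw = bytearray()
    i = 0
    while i < len(s):
        if s[i] == '~':
            raw.extend(bytes.fromhex(s[i + 1:i + 3]))
            i += 3
        elif s[i] == '+':
            raw.extend(b' ')
            i += 1
        else:
            raw.extend(s[i].encode('utf-8'))
            i += 1
    return raw.decode('utf-8')


print(tilde_encode('polls/2022.primary'))  # polls~2F2022~2Eprimary
```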
datasette.utils.tilde_encode(s: str) -> str
Returns tilde-encoded string - for example /foo/bar -> ~2Ffoo~2Fbar
datasette.utils.tilde_decode(s: str) -> str
Decodes a tilde-encoded string, so ~2Ffoo~2Fbar -> /foo/bar",8,
178,await_me_maybe(value),"Utility function for calling await on a return value if it is awaitable, otherwise returning the value. This is used by Datasette to support plugin hooks that can optionally return awaitable functions. Read more about this function in The “await me maybe” pattern for Python asyncio .
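The pattern can be sketched in a few lines. This is a simplified illustration of the idea, not Datasette's exact source:

```python
import asyncio
import inspect


async def await_me_maybe(value):
    # Call it if it is callable...
    if callable(value):
        value = value()
    # ...then await the result if it turned out to be awaitable
    if inspect.isawaitable(value):
        value = await value
    return value


async def demo():
    plain = await await_me_maybe(42)
    from_func = await await_me_maybe(lambda: 7)

    async def coro():
        return 'async result'

    from_coro = await await_me_maybe(coro)
    return plain, from_func, from_coro


print(asyncio.run(demo()))  # (42, 7, 'async result')
```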
async datasette.utils.await_me_maybe(value: Any) -> Any
If value is callable, call it. If awaitable, await it. Otherwise return it.",8,
177,parse_metadata(content),"This function accepts a string containing either JSON or YAML, expected to be of the format described in Metadata . It returns a nested Python dictionary representing the parsed data from that string.
If the metadata cannot be parsed as either JSON or YAML the function will raise a utils.BadMetadataError exception.
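A minimal sketch of that try-JSON-then-YAML approach, for illustration only (the real function and exception live in datasette.utils):

```python
import json


class BadMetadataError(Exception):
    pass


def parse_metadata(content):
    # Try JSON first - json.loads gives precise errors for JSON input
    try:
        return json.loads(content)
    except json.JSONDecodeError:
        pass
    # Fall back to YAML (PyYAML is only needed on this path)
    try:
        import yaml
        parsed = yaml.safe_load(content)
    except Exception:
        raise BadMetadataError('Metadata is not valid JSON or YAML')
    if not isinstance(parsed, dict):
        raise BadMetadataError('Metadata parsed to a non-dictionary')
    return parsed


doc = json.dumps({'title': 'Demo'})  # build a JSON string to parse
print(parse_metadata(doc))  # {'title': 'Demo'}
```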
datasette.utils.parse_metadata(content: str) -> dict
Detects if content is JSON or YAML and parses it appropriately.",8,
176,The datasette.utils module,"The datasette.utils module contains various utility functions used by Datasette. As a general rule you should consider anything in this module to be unstable - functions and classes here could change without warning or be removed entirely between Datasette releases, without being mentioned in the release notes.
The exception to this rule is anything that is documented here. If you find a need for an undocumented utility function in your own work, consider opening an issue requesting that the function you are using be upgraded to documented and supported status.",8,
175,The _internal database,"This API should be considered unstable - the structure of these tables may change prior to the release of Datasette 1.0.
Datasette maintains an in-memory SQLite database with details of the databases, tables and columns for all of the attached databases.
By default all actors are denied access to the view-database permission for the _internal database, so the database is not visible to anyone unless they sign in as root .
Plugins can access this database by calling db = datasette.get_database(""_internal"") and then executing queries using the Database API .
You can explore an example of this database by signing in as root to the latest.datasette.io demo instance and then navigating to latest.datasette.io/_internal .",8,
174,CSRF protection,"Datasette uses asgi-csrf to guard against CSRF attacks on form POST submissions. Users receive a ds_csrftoken cookie which is compared against the csrftoken form field (or x-csrftoken HTTP header) for every incoming request.
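The comparison at the heart of that check boils down to a random token plus a constant-time equality test. A hedged sketch of the general pattern (illustrative names, not asgi-csrf's actual code):

```python
import secrets


def generate_csrf_token():
    # Random value stored in the ds_csrftoken cookie
    return secrets.token_hex(16)


def tokens_match(cookie_token, submitted_token):
    # Constant-time comparison guards against timing attacks
    return secrets.compare_digest(cookie_token, submitted_token)


token = generate_csrf_token()
print(tokens_match(token, token))  # True
```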
If your plugin implements a